All posts by Brian Vecci

The Differences Between DLP, IAM, SIEM, and Varonis Solutions

You can’t always do it all alone and sometimes you need help from your friends. It’s good life advice, and as it turns out, good advice for a security solution. A multi-pronged security program that uses a mix of technologies and approaches is the best way to reduce risk and to protect your organization’s most important data resources.

For example, Data Loss Prevention (DLP) solutions are often used to help protect sensitive data as it moves around the network and makes its way to endpoint devices. Identity and Access Management (IAM) solutions complement DLP by connecting disparate authentication services together, so that when users need to access systems or applications, they make a request through a single service. And Security Information and Event Management (SIEM) tools aggregate, correlate, and help analyze the logs from a variety of different sources in a single repository.

Yes, organizations often employ some or all of DLP, IAM, and SIEM in a best-of-breed approach. But what are the differences between these technologies, and how do they relate to Varonis, which is none of the three?

Let’s go through the distinctions.

Data Loss Prevention

To prevent sensitive data from making its way outside the corporate network, DLP solutions execute responses based on pre-defined policies and rules, ranging from simple notification to active blocking.

DLP typically covers three high-level use cases: endpoint protection, network monitoring of data in motion, and classification of data at rest.

Endpoint protection use cases include hard drive encryption, optical drive and USB port locking to prevent exfiltration, and malware protection.

Data-in-motion technologies inspect email and web traffic to identify sensitive data that may be on its way out of the organization, and may also help ensure that content is only accessed over encrypted channels.

Data-at-rest classification inspects the content of files to identify where sensitive data exists on server and cloud platforms, so that additional action can be taken to ensure proper access controls.
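
To make the data-at-rest piece concrete, here’s a minimal sketch of the kind of content scan a classifier performs: it walks a folder tree and flags files whose text matches a couple of illustrative patterns. The paths, patterns, and logic are assumptions for illustration only; real DLP classifiers use far more robust rules and validation.

```python
import os
import re

# Illustrative patterns only -- real classifiers use stronger rules
# (keyword proximity, checksums, Luhn validation for card numbers, etc.).
PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def classify_tree(root):
    """Walk a folder tree and report files containing sensitive-looking text."""
    hits = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            try:
                with open(path, "r", errors="ignore") as fh:
                    text = fh.read()
            except OSError:
                continue  # unreadable file: skip it
            matched = [label for label, rx in PATTERNS.items() if rx.search(text)]
            if matched:
                hits.append((path, matched))
    return hits

if __name__ == "__main__":
    # Hypothetical share path -- point this at a folder you want to scan.
    for path, labels in classify_tree(r"\\fileserver\share"):
        print(f"{path}: {', '.join(labels)}")
```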

IAM

While DLP is great for protecting sensitive data, it generally has no information about how data is being used or how access controls are granted.  To obtain this access information, many organizations turn to Identity and Access Management.

Identity Management serves as a gatekeeper for user access rights. When a user starts a new role, he is granted authorization and access rights to systems and applications, and when he leaves the organization, those rights are terminated.

What makes Access Management so critical is that access rights, especially for unstructured data, typically accumulate over time. The longer a user stays with a company, the more access the user usually has. Users with privileges beyond what their current role requires put the company at risk, and if a hacker gains access to the account of a user with excessive access, the risk increases further. Both scenarios can result in a data breach.

Identity and Access Management ties disparate applications together into a single repository for managing access and entitlements. IAM solutions will often provide access management workflows, user entitlement reporting, application owner entitlement reviews, and even single sign-on (SSO) between applications, with the goal of providing a single entitlement store and workflow solution for managing access.

SIEM

SIEM systems store, analyze, and correlate a multitude of security information: authentication events, anti-virus events, intrusion events, and so on. When a rule detects an anomalous event, it alerts a security officer or analyst to take swift action.

SIEM systems aggregate logs, most commonly by reading event viewer data, receiving standard feeds via SNMP traps or Syslog, or collecting log data with the help of agents. These feeds come from user devices, network switches and other network gear, servers, firewalls, anti-virus software, intrusion detection/prevention systems, and many more. Once all of the data is centralized, the SIEM runs reports, “listens” for anomalous events, and sends alerts.

For the SIEM tool to identify anomalous events and send alerts, it’s important that an administrator create a profile of the system under normal conditions. SIEM alerts can be pre-configured with canned rules, or you can create custom rules that reflect your security policies.

After events are sent to the system, they pass through a series of rules, which generate alerts if certain conditions are met. Keep in mind that with potentially thousands of devices and other sources to monitor, each generating thousands of records or more a day, there will be plenty of data to sift through. The goal is to use SIEM rules to reduce that flood of events down to a small number of actionable alerts that signal real-world vulnerabilities, threats, or risk.
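
To make the idea of a correlation rule concrete, here’s a minimal sketch (not tied to any particular SIEM product) that counts failed-logon events per account inside a sliding window and raises an alert once a threshold is crossed. The event shape, window, and threshold are illustrative assumptions.

```python
from collections import defaultdict, deque
from datetime import datetime, timedelta

WINDOW = timedelta(minutes=5)   # illustrative sliding window
THRESHOLD = 10                  # illustrative failed-logon threshold

failed_logons = defaultdict(deque)  # account -> timestamps of recent failures

def process_event(event):
    """Apply one simple correlation rule: many failed logons for one account."""
    if event["type"] != "failed_logon":
        return None
    account, ts = event["account"], event["timestamp"]
    q = failed_logons[account]
    q.append(ts)
    # Drop failures that have fallen out of the sliding window.
    while q and ts - q[0] > WINDOW:
        q.popleft()
    if len(q) >= THRESHOLD:
        return f"ALERT: {len(q)} failed logons for {account} within {WINDOW}"
    return None

# Example feed -- in a real SIEM these events arrive via syslog, agents, etc.
now = datetime.now()
for i in range(12):
    alert = process_event({"type": "failed_logon", "account": "jsmith",
                           "timestamp": now + timedelta(seconds=10 * i)})
    if alert:
        print(alert)
```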

Varonis

Varonis does not provide DLP, IAM, or SIEM functionality, and is not designed to replace any of those solutions. In fact, Varonis tends to enhance each one by providing visibility into and context around unstructured data, which helps prevent insider and outsider threats, malware activity, lateral movement, data exfiltration, and potential data breaches.

What sets Varonis solutions apart from traditional file-level DLP solutions?

Identifying sensitive data on your servers and/or blocking it is DLP’s strong suit. Yes, it knows where all your sensitive files reside, but it has a weak point: if a hacker or insider compromises an account that is authorized to access sensitive documents, DLP can’t stop it.

To really protect your organization’s sensitive data, you should also know:

  • who is accessing it
  • who has access to it
  • who likely no longer needs access
  • who outside of IT the data belongs to, and
  • when a user or users start accessing that data in strange ways.

Varonis makes DLP better by providing all of that additional context. After absorbing the classification scans from DLP, Varonis provides activity monitoring, alerting, and behavior analysis along with intelligent permissions management. DLP tells you where your sensitive data is, and Varonis helps make sure that only the right people have access to it and that you know when access is abused.

What sets Varonis solutions apart from IAM solutions?

Even though IAM connects various applications and systems into a single solution for entitlements, that functionality tends to stop when it comes to unstructured data. Because access to unstructured data is controlled by directory users and groups together with file system ACLs, there’s no single “application” for IAM to connect to. This means that IAM has a blind spot when managing access to unstructured data.

Moreover, access to unstructured data tends to be chaotic and unmanaged: permissions are complex and non-standard, multiple groups often have access to the same data, folders and SharePoint sites are open globally, and so on. As a result, managing unstructured data entitlements through IAM alone is often impossible.

This is where Varonis can help.

DatAdvantage allows IAM to extend to unstructured data through many use cases:

  • Map out the functional relationships between users/groups and the data necessary for a role.
  • Restructure permissions so that they can be efficiently managed through single-purpose groups.
  • Analyze user behavior over time and provide recommendations to owners on who likely no longer needs access.
  • Leverage data classification to help ensure sensitive data is owned and managed appropriately.

DataPrivilege can complement IAM, empowering data owners and users by:

  • Enabling ad-hoc requests so users can get access to data, only for as long as necessary, without having to redefine a role
  • Giving data owners insight into activity on their data sets
  • Allowing for regular reviews of access to ensure only the right people have access to the right data

What sets Varonis solutions apart from SIEM?

SIEM will read event logs from network devices, systems, and AD, but it has no view into actual data activity, since those logs often don’t exist natively and, where they do, can be difficult to parse.

With our file activity monitoring system, Varonis closes this gap by collecting and analyzing all access activity on platforms SIEM can’t usually see.

We can tell your SIEM when someone is accessing the CEO’s mailbox, changing critical GPOs, encrypting large numbers of files in a short period of time, or otherwise misbehaving when it comes to your data and directory services.

Moreover, Varonis baselines user activity and provides alerts that can be passed directly to SIEM for further correlation, analysis or action. Varonis alerts can be sent via Syslog to any SIEM, and there are pre-built templates for connection with some specific platforms.
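
Since the transport is plain Syslog, the forwarding side of such an integration can be sketched in a few lines with Python’s standard library. The collector address and message fields below are assumptions for illustration, not Varonis’s actual alert templates.

```python
import logging
import logging.handlers

# Hypothetical collector address -- most SIEMs listen for syslog on UDP 514.
SIEM_HOST, SIEM_PORT = "localhost", 514

handler = logging.handlers.SysLogHandler(address=(SIEM_HOST, SIEM_PORT))
logger = logging.getLogger("file-activity-alerts")
logger.setLevel(logging.WARNING)
logger.addHandler(handler)

# A made-up key=value message; real integrations follow the field layout
# the receiving SIEM expects.
logger.warning(
    "alert=MassFileEncryption user=jsmith host=FILESRV01 "
    "count=1500 window=60s severity=high"
)
```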

Summing Up

DLP, IAM, and SIEM are all useful, important technologies for enterprise security. No single product or category is enough to protect an organization’s data and systems, and defense in depth is becoming increasingly important. When it comes to unstructured data, all of these technologies have significant gaps in the kinds of detective and preventive controls they can provide, and all of them are made more useful by integrating with the Varonis Metadata Framework.

Using Varonis: “Fast Track” Recommendations

(This is one entry in a series of posts about the Varonis Operational Plan – a clear path to data governance.  You can find the whole series here.)

Over time, a user’s access to systems and data in an organization tends to grow, regardless of their role or responsibilities. From an IT perspective, it’s much more common  to grant permissions than it is to revoke them—when a user needs access to something they usually make a lot of noise.

Every time there’s a new project or roles change, there might be new data sets that users need access to in a hurry, and they’re certainly not shy about asking IT to unlock that data. The converse isn’t necessarily true—if a user has access to a lot of shared data that they don’t need anymore, it’s extremely rare that they will call up the help desk demanding to have their access revoked.

This means that, over time, users end up with far more access than they need, and it can be difficult, or even impossible, to manually identify this type of stale, excessive authorization.

The question “Which users no longer need access?” is very difficult for IT to answer without the right kind of intelligence. DatAdvantage Recommendations offer one of the most advanced ways to answer it. Following our last post on the Varonis Operational Plan, the next action involves “fast-tracking” those recommendations in order to quickly reduce unneeded (and potentially dangerous) access to high-risk data sets.

We covered this feature of DatAdvantage (and its integration into DataPrivilege) in a previous post regarding Entitlement Reviews by Data Owners. Using Recommendations in this way is a great way to help maintain access controls over time, but it’s important to realize that getting Data Owners to the point where they’re actively maintaining their data may take some time. In order to quickly reduce risk while operationalizing the Varonis suite, the Operational Plan explains how IT can immediately commit recommended changes without first involving data owners. This technique makes a lot of sense when the data in question is of high value and you’ve observed 4-6 weeks of access activity.

DatAdvantage Recommendations are a powerful, automated way to identify users with excessive access. By committing the recommended changes (right from the DatAdvantage interface) you can quickly ensure that access to high-risk data is safely reduced without impacting your users’ ability to collaborate. Ultimately, Data Owners should be the ones reviewing and enacting changes, but an initial audit and clean-up of current access rights is a great way to reduce risk quickly.

Image credit (public domain): pixabay

Using Varonis: Implementing Automatic Rules

(This is one entry in a series of posts about the Varonis Operational Plan – a clear path to data governance.  You can find the whole series here.)

What good are rules if nobody follows them, right? If we put a business policy in place which dictates that only a select few users should ever have access to customer data, it’s critical to ensure that the rule is always applied.  Automation is much better at verifying and enforcing access control policies than humans are.  It’s simply not scalable to look through logs of access control changes or spot check ACLs looking for violations.

DataPrivilege is an interface into the Varonis framework that’s designed to involve and empower Data Owners while also providing the ability to automate policy-based access control rules. Following our last Varonis operational plan post, the next step is to design and implement Automatic Rules within DataPrivilege to help make sure business policies are implemented properly and automatically enforced across appropriate managed data.

Let’s walk through a simple example.

Imagine a financial services organization with both an investment bank and a retail wealth management arm. In general, these two sides of the bank should never have access to the same sets of data—customer information must be entirely isolated from the investment side of the bank. Controlling access to application and structured data is one thing, but access to unstructured data where users are constantly copying and moving files around can be a separate, and massive, problem.

From an access control standpoint, there are a couple of things that should happen in order to ensure compliance. First, if any user from the wealth management side of the bank requests access to the investment side (or vice versa), that request should be denied regardless of who approves it. Second, any time invalid access is granted, whether intentionally or not, it should be automatically and immediately revoked.

Any non-compliant access granted outside of the normal approval process should also be reverted, covering both rule-breaking access configured before the policy was enacted and any future non-compliant access granted outside normal workflows. These kinds of rules are a great way to automatically ensure access compliance.

DataPrivilege allows an organization to configure automatic rules for both group membership and directory access, so access can be automatically granted or revoked based on available user attributes. Because business policies are enforced automatically, DataPrivilege can correct erroneous access without the need for administrators to manually detect and fix problems.
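
As a generic illustration of what an attribute-based rule evaluates (this is not DataPrivilege’s actual rule syntax or engine), here’s a minimal sketch that compares group membership against a user’s department attribute and flags non-compliant memberships for revocation. The group names, departments, and data shapes are made up.

```python
# Hypothetical policy: each group may only contain members of certain departments.
POLICY = {
    "grp_retail_customer_data": {"Wealth Management"},
    "grp_investment_deals": {"Investment Banking"},
}

def non_compliant_memberships(users, group_members):
    """Return (user, group) pairs that violate the policy and should be revoked."""
    violations = []
    for group, allowed_departments in POLICY.items():
        for user in group_members.get(group, []):
            department = users.get(user, {}).get("department")
            if department not in allowed_departments:
                violations.append((user, group))
    return violations

# Illustrative directory data.
users = {
    "alice": {"department": "Wealth Management"},
    "bob": {"department": "Investment Banking"},
}
group_members = {"grp_retail_customer_data": ["alice", "bob"]}

for user, group in non_compliant_memberships(users, group_members):
    print(f"Revoke {user} from {group}")  # bob violates the policy
```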

Image credit (cc): flickr/sterlic

Using Varonis: Involving Data Owners – Part II

(This is one entry in a series of posts about the Varonis Operational Plan – a clear path to data governance.  You can find the whole series here.)

If your doctor said “Your blood pressure is 120/95” would that mean anything to you?  Even if you could interpret that data as symptomatic of stage 1 high blood pressure, would it be actionable?  A helpful doctor would not only help you understand your vital stats, she’d also empower you to make informed decisions about your health.

Likewise, not only should we deliver targeted reports to data owners, we should ensure that the information is actionable and provokes intelligent, data-driven decisions.

The next step in the Operational Plan is to help owners make informed decisions about who should have access to their data, and to make sure their decisions can be executed without bogging anyone down in paperwork. With DataPrivilege we can do exactly that.

Entitlement Reviews

One of the first actions data owners can take is to re-certify access to their data through an attestation, or entitlement review. At a high level, the owner reviews the list of users who have access to their data (including users who probably shouldn’t), makes any appropriate changes, and then commits those changes to file systems or directory services. What has typically been a very manual and time-intensive (for IT) task can be completely automated with DataPrivilege, the internal web-based interface into the Varonis Metadata Framework.

Once configured, DataPrivilege Entitlement Reviews offer automatic, web-based forms delivered on a regular basis that show data owners exactly who has access to their data, highlighting any users that DatAdvantage recommends for removal based on its automated analysis. These recommendations show owners those users who have likely moved on to other roles, left the company, or were added by mistake.  Varonis’ recommendation engine is like the doctor with extremely trustworthy advice on how to immediately improve your health.

These entitlement reviews can be set up for data sets—reviewing the users with access to a specific folder or share—and/or for security groups or mail-enabled distribution lists. This means an organization is able to effectively shift the burden for access reviews for all data to its rightful owner, as well as leverage the same system for application and other group reviews.

Authorization Workflow

While entitlement reviews are key to correcting and maintaining access controls, it’s also important to involve owners at the “point of sale,” when access is initially requested by a user. Traditionally, access control approval has come from the manager of the requesting user, a group owner who may or may not be aware of what data that group grants access to, or IT, rather than the actual Data Owner. This is a problem, since none of those is usually the person with the best context to make good access control decisions. To continue our metaphor—it’s like allowing the pharmacy to decide which medicine we should take.

DataPrivilege changes this model by offering an authorization workflow that puts decisions into the hands of owners and their designated delegates. A big part of operationalizing DataPrivilege is transitioning this approval process from IT to the end users and owners themselves. It can mean significant operational resource gains for IT as well as a higher level of service and data protection.

Self-Service Portal

The last thing I want to mention about DataPrivilege is the Self-Service Portal, which allows Data Owners to get information and make decisions on-demand. The DataPrivilege portal lets owners see—at any time—information about their data, including permissions, log information and statistics.

We’ve found that many of our customers have seen impressive results once they deploy the portal to their users. If you give owners information about their assets and the ability to make decisions, they tend to use it. The Self-Service Portal is another way IT can shift the management burden to owners themselves.

Empowering owners to implement policy is a great first step, but DataPrivilege also offers the ability to automate a lot of this work. The next step in the Varonis Operational Plan involves setting up and deploying automatic rules. Stay tuned!

Image credit: flickr – epsos

Using Varonis: Involving Data Owners (Part I)

(This is one entry in a series of posts about the Varonis Operational Plan – a clear path to data governance. You can find the whole series here.)

Almost every organization is now data driven. With all the talk about data growth and big data analytics over the past couple of years, people have started to ask: “How do we maximize the value of our data? How can we make sure we’re deriving real business benefit?”

The keys to maximizing the value of our data are to gather the right intelligence about it and then give the right people the ability to take action on that intelligence.

Now that we know who our Data Owners are, it’s time to start getting them involved. Remember that it’s the owners—not IT—that have adequate context to make decisions about who should and shouldn’t have access to their assets.

The next step in operationalizing Varonis is to provide owners intelligence about their data assets. DatAdvantage can deliver data-driven reports that shed light on what is happening with their data: who can access it, what they’re doing with it, which data is stale, etc. These reports greatly simplify and optimize reporting by delivering to each owner only the information about the data they own.

An Example

Say you’ve spent a few weeks identifying and confirming business owners for all of the top-level folders on a large NAS (or two, or three…). Depending on the size of the company, this might be a few dozen or a few thousand people. One of the most common next steps is to provide permissions reports on all of these data sets to the relevant owners. So the HR owner gets a report on all of the users who have access to the HR folder, for instance. It’s the same with Finance, Marketing, R&D, etc. In the past, you would have to create and deliver a separate report for each owner, which depending on the complexity of your reporting process might be an onerous undertaking all by itself. DatAdvantage gives you a far better alternative.

In DatAdvantage, to accomplish the same thing, you’d only need to create a single report, and all owners would get permissions reports once a quarter (or however often you like). Create the report, include the proper filters and formatting, and then set up a data-driven subscription to be delivered on the first day of the first month of the quarter. That’s it; you’re done.

Every quarter, every data owner is going to get that report in their inbox, and the report will contain information about only the data that they own—they won’t see anything that doesn’t belong to them. As you add and change owners over time, the subscription will continue to work without intervention. If my job role changes and suddenly I’m the owner of additional folders, my permissions report will show those as well. If I’m no longer an owner, my report won’t contain information about what I no longer own.

Permissions reporting is a great use case for data-driven reports, and it’s not the only one. Reports that show actual access can be useful, too. What if every data owner could see exactly who on their team was accessing data most? What about those people who weren’t accessing any? Or people from outside their team bumbling around? Who creates content? Showing owners what data is stale or which folders are growing the fastest can help them understand how they’re using resources. Providing owners intelligence about where their sensitive data is, where it’s exposed, and who has been accessing it leads to informed decisions about how they can reduce risk.

Once you’ve started putting intelligence into the hands of your owners, the next step is to give them the power to take action without bugging IT. We’ll cover that next.

Top 3 SharePoint Security Challenges

The rapid adoption of SharePoint has outpaced the ability of organizations to control its growth and enforce consistent policies for security and access control. The ease with which SharePoint sites can be created means that SharePoint use is decentralized and often outside the purview of IT departments, security personnel and even dedicated SharePoint administrators.

So what are the top 3 SharePoint security challenges?

1 – Organic and chaotic deployment of SharePoint sites

Pervasive departmental use of SharePoint means that all types of data make their way into SharePoint repositories. This data ranges in sensitivity and importance and may easily include human resources or product information. So the problem for organizations becomes not only identifying sensitive data but also locating all SharePoint sites, existing and emerging.

2 – Ad hoc, complex permissions administration

The levels and types of permissions available in SharePoint are more complex than their NTFS counterparts, and the additional granularity and inheritance complexity create more access levels and a higher probability of erroneous or overly permissive access.

While access control decisions may be (rightly) left to the data owners through SharePoint’s permissions workflow, the complexity of its implementation often leads to inconsistency in ACL configuration and group assignment. Without strict auditing and oversight, permissions may be set in conflict with enterprise-level access policies, and may not include key business intelligence about why the access should be limited (e.g., content might be regulated or copyright protected).

3 – Limited, resource-intense auditing

Key to maintaining good access control over data is continuous monitoring of how data is being used. This is another challenge in a SharePoint environment. Microsoft SharePoint audit detail is geared toward helping site administrators manage content, not toward refining access policy. Consequently, there is no easy way for SharePoint administrators to establish which users took what action on which data.

The native auditing capabilities are also limited in terms of scalability across sites. “Normalizing” the data, i.e., creating a unified and accurate view of data use and access across sites and locations, is challenging and time-intensive. Exacerbating the problem is that files on SharePoint often make their way to other platforms like file shares and email – without a unified audit trail of activity, understanding how and by whom data is accessed in the collaborative environment can be a significant challenge.

Download our FREE guide to learn how to make sense of SharePoint permissions & lock down and monitor your sensitive data.

Using Varonis: Who Owns What?

(This is one entry in a series of posts about the Varonis Operational Plan – a clear path to data governance. You can find the whole series here.)

All organizational data needs an owner. It’s that simple, right? I think most of us would be hard pressed to argue against that as a principle—the data itself is an organizational asset, so of course it’s not the Help Desk or AD Admin folks who own it, it’s the users or business units that should own it. Of course, that’s great in theory, but with 1, 5, 10, or even 20 years’ worth of shared, unstructured data, figuring out who owns data is far from simple, let alone involving those owners in any meaningful way.

Before we get into using Varonis to locate owners, I want to talk about why finding a single data owner can be such a problem. IT probably knows who owns the Finance folder.  It’s the CFO or a delegated steward. Same with HR, Marketing or Legal—these tend to be clearly-delineated departmental shares and it’s not hard to figure out whom to go to if we need an informed decision. (Regularly involving those owners in data governance is a different problem, and one I will cover in future posts.)  The identification for these folders is relatively straightforward.

But what happens if you need to find the owner of a folder that has a less obvious name? What if the folder’s name is a project ID, or an acronym of some kind? In my experience, a majority of unstructured data resides in folders that aren’t obviously owned by anyone.

What IT tends to do then is one of a few things:

  • Check the ACL and see which groups have access. If it’s a single group with an obvious owner, that’s a likely candidate. If the ACL contains many different groups or a global access group like Domain Users, though, this tactic tends to fail.
  • Check the Windows owner under Special Permissions. This metadata can be helpful, but can also be a red herring since it’s often just set to the local Administrator of the server. Even if there’s actually a human user there (who likely created the folder), that value may be outdated or inaccurate.
  • Check the owner of files within the folder. Same problems as above.
  • Enable operating system auditing to identify the most active user. Anyone out there excited about turning on file-level auditing in Windows? I have yet to talk to anyone who answers yes, because of the performance hit on the server as well as the storage and expertise required to parse the logs effectively.
  • Turn off access and see who complains. Not an optimal strategy when it comes to critical data.
  • Email the world and hope for a response. In general, people don’t want to take ownership of something without good reason, since it may mean more work. How confident are you that the proper owners (who may be at a management or director level) are going to know exactly which data sets their teams are using regularly? If they’re not sure, are they going to jump to take responsibility?

So finding owners is hard, let alone finding owners at scale. If you’ve got thousands of unique ACLs and you want owners for all of them (or at least the ones that make sense) you’re going to have to go through some version of this process for each one. It’s no wonder we haven’t done a good job of this over time. Thankfully, there’s a better way.

Step 4: Identify Data Owners

The key difference between attempting to solve this problem manually and attacking it intelligently with Varonis is the DatAdvantage audit trail. A normalized, continuous, non-intrusive audit record of all data access is a key piece of DatAdvantage, and it allows us to actually identify data owners at scale without having to hunt and peck. Once you start gathering usage data and rolling it up into high level stats you can start to see the likely owners of any data set, not just the obvious ones.

DatAdvantage gives you two straightforward ways to get this information: First, we can quickly take a look at a high-level view of a single folder within the Statistics pane of the DatAdvantage GUI. This will show us the most active users of a particular folder. We like to say that at most, you’re one phone call away, since if the most active user isn’t the data owner, they almost certainly know who is.

You can operationalize this process even further by creating a statistics report, which can be run on an entire tree or even a server. A single report can show the top users of every unique ACL, and it’s possible to set up advanced filters to make this even more useful—showing only users outside of IT or in a specific OU, for example. You can even add additional properties from AD to the report, showing each user’s department or line manager, if available. None of this is possible without constantly gathering access activity and providing an interface to combine it with other available metadata.
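
Conceptually, the statistics behind a report like this come from rolling up the audit trail by folder, counting events per user, and joining the result with directory attributes. Here’s a minimal sketch of that roll-up under assumed event and AD-attribute shapes (it is not how DatAdvantage computes its statistics internally):

```python
from collections import Counter, defaultdict

def top_users_per_folder(events, ad_attributes, top_n=3):
    """Roll up raw access events into the most active users for each folder."""
    counts = defaultdict(Counter)  # folder -> Counter mapping user -> event count
    for event in events:
        counts[event["folder"]][event["user"]] += 1

    report = {}
    for folder, user_counts in counts.items():
        report[folder] = [
            # Join with AD attributes to show department/manager next to activity.
            {"user": user, "events": n, **ad_attributes.get(user, {})}
            for user, n in user_counts.most_common(top_n)
        ]
    return report

events = [
    {"user": "alice", "folder": r"\\nas\projects\fyq3-docs"},
    {"user": "alice", "folder": r"\\nas\projects\fyq3-docs"},
    {"user": "bob", "folder": r"\\nas\projects\fyq3-docs"},
]
ad_attributes = {
    "alice": {"department": "Finance", "manager": "carol"},
    "bob": {"department": "Finance", "manager": "carol"},
}

print(top_users_per_folder(events, ad_attributes))
```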

Identifying owners is useful, but actually involving them is where IT can really start to make headway when it comes to ongoing governance. We’ll tackle that next.

Using Varonis: Which Data Needs Owners?

(This is one entry in a series of posts about the Varonis Operational Plan – a clear path to data governance. You can find the whole series here.)

In a single terabyte of data there are typically around 50,000 folders or containers, about 5% of which have unique permissions. If IT were to set a goal of assigning an owner for every unique ACL, they’d need to locate owners for 2,500 folders. That’s quite daunting. And most organizations aren’t dealing with a single terabyte of data; in fact, many enterprise installations we encounter are dealing with multiple petabytes of unstructured data. Clearly we need a more surgical approach to assign owners.

Varonis tackled this problem with a longtime customer who needed to identify and assign owners for more than 200 terabytes of CIFS data on their fleet of NetApp filers. There were about 40,000 users in the company, approximately 3,000 of whom (as it turned out) needed to be designated as owners for some data.

When we started taking a close look at specific folders, we discovered that many of them (especially at the top of the hierarchy) simply didn’t need an owner; the only users who could read or write data, according to the ACL, were either service accounts or administrative/IT accounts.

What we needed was a methodology for locating the folders where business users had access and a way to identify the likely owner for just those folders. So that’s what we built.

The logic went like this:

  • Identify the topmost unique ACL in a tree where business users have access.
  • If that ACL’s permissions allow write access to users outside of IT, it’s considered a “demarcation point.”
  • For what’s left, identify higher-level demarcation points where non-IT users can only read data.
  • For each demarcation point, identify the most active users.
  • Correlate active users with other metadata, such as department name, payroll code, managed by, etc.
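
A rough sketch of the first three steps of that logic, over a simplified in-memory model of the folder tree (paths mapped to the groups with write or read access, plus a set of groups treated as non-IT), might look like the following. The data shapes are assumptions for illustration; the real analysis ran against the permissions and activity DatAdvantage collects, with the activity and metadata correlation steps following afterward.

```python
def find_demarcation_points(folder_acls, business_groups):
    """Return the topmost folders where non-IT (business) users have access.

    folder_acls: {folder_path: {"write": set_of_groups, "read": set_of_groups}}
    business_groups: set of group names considered non-IT.
    """
    def business_access(acl):
        if acl["write"] & business_groups:
            return "write"
        if acl["read"] & business_groups:
            return "read"
        return None

    demarcation = {}
    for path in sorted(folder_acls):  # parent paths sort before their children
        access = business_access(folder_acls[path])
        if not access:
            continue
        # Skip folders already covered by a higher-level demarcation point.
        if any(path.startswith(parent + "\\") for parent in demarcation):
            continue
        demarcation[path] = access
    return demarcation

folder_acls = {
    r"\\nas\dept": {"write": {"IT_Admins"}, "read": {"IT_Admins"}},
    r"\\nas\dept\finance": {"write": {"grp_finance"}, "read": {"grp_all_staff"}},
    r"\\nas\dept\finance\archive": {"write": {"grp_finance"}, "read": set()},
}
print(find_demarcation_points(folder_acls, {"grp_finance", "grp_all_staff"}))
# Only the topmost business-accessible folder is reported as a demarcation point.
```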

The end result of this process is that each demarcation point has a likely ownership candidate. For this particular customer, the next step was to go through a survey process to confirm ownership of each demarcation point with the likely owners (as determined by Varonis’ reports). Any data without a confirmed owner was locked down to remove non-IT access and underwent a separate disposition process.

Other customers have since added content classification and other risk factors in order to better prioritize the data ownership assignment process. With a good classification scheme in place, IT is able to start assigning owners to the most critical data first.

The key takeaway from this process is that we can use DatAdvantage to quickly identify the folders that need owners, as well as their likely owners, so IT doesn’t need to make decisions about 2,500 folders per terabyte of data.

While this report was originally a customization for one customer, we’ve now baked it right into DatAdvantage as report 12M – Recommended Base Folders.

Now that we know who our owners are, the next step is to start getting them involved. My next few posts will cover exactly how we do this using both DatAdvantage and DataPrivilege.

Stay tuned!

Image credit: gorbould

Top 5 Things IT Should Be Doing, But Isn’t

A clear path to effective information governance.

1. Audit Data Access

Effective management of any data set is impossible without a record of access. Unless one can reliably observe data use, one cannot observe its non-use, misuse, or abuse. Without a record of data usage, one cannot answer critical questions, from the most basic ones, like “who deleted my files, what data does this person or group of people use, and what data isn’t used?” to more complex ones, like “who owns a data set, which data sets support this business unit, and how can I lock down data without disrupting workflows?”

2. Inventory Permissions and Directory Services Group Objects

Effective management of any data set is also impossible without understanding who has access to it. Access control lists and groups (in Active Directory, LDAP, etc.) are the fundamental protective control mechanism for all unstructured and semi-structured data platforms, yet too often IT cannot easily answer fundamental data protection questions like “Who has access to a data set?” and “What data sets does a user or group have access to?” Answers to these questions must be accurate and accessible for data protection and management projects to succeed.
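
In data-structure terms, answering both questions quickly means keeping the inventory indexed in both directions: data set to users, and user to data sets. Here’s a minimal sketch of that double index with group membership expanded (nested groups, which a real inventory must also expand, are left out for brevity), using made-up folder and group names:

```python
from collections import defaultdict

def build_permission_indexes(folder_acls, group_members):
    """Build two lookups: who can access each folder, and what each user can access."""
    folder_to_users = {}
    user_to_folders = defaultdict(set)
    for folder, groups in folder_acls.items():
        users = set()
        for group in groups:
            users |= set(group_members.get(group, []))  # expand group membership
        folder_to_users[folder] = users
        for user in users:
            user_to_folders[user].add(folder)
    return folder_to_users, user_to_folders

folder_acls = {r"\\nas\hr": {"grp_hr"}, r"\\nas\finance": {"grp_finance"}}
group_members = {"grp_hr": ["alice"], "grp_finance": ["alice", "bob"]}

folder_to_users, user_to_folders = build_permission_indexes(folder_acls, group_members)
print(folder_to_users[r"\\nas\hr"])   # Who has access to a data set?
print(user_to_folders["alice"])       # What data sets does a user have access to?
```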

3. Prioritize Which Data Should Be Addressed

While all data should be protected, some data needs to be protected much more urgently than other data. Some data sets have well known owners and well defined processes and controls for their protection, but many others are less understood. With an audit trail, data classification technology, and access control information, organizations can identify active and stale data, data that is considered sensitive, confidential, or internal, and data that is accessible to many people. These data sets should be reviewed and addressed quickly to reduce risk.

4. Remove Global Access Groups from ACLs (like “Everyone”) – especially where sensitive data is located

It is not uncommon for folders on file shares to have access control permissions allowing “Everyone,” or all “Domain Users” (nearly everyone), to access the data contained therein. SharePoint has the same problem (especially with Authenticated Users), and Exchange has these issues as well, including “Anonymous User” access. This creates a significant security risk: any data placed in such a folder will inherit those “exposed” permissions, and those who place data in these wide-open folders may not be aware of the lax access settings. When sensitive data, like PII, credit card information, intellectual property, or HR information, sits in these folders, the risks become very significant. Global access to folders, SharePoint sites, and mailboxes should be removed and replaced with rules that give access to the explicit groups that need it.
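
One practical way to surface this risk is to scan a permissions inventory for ACLs that contain global access groups and prioritize the folders already flagged as sensitive by classification. Here’s a minimal sketch over an assumed exported inventory (folder paths mapped to granted groups, plus a set of classified folders); the group and folder names are made up:

```python
GLOBAL_GROUPS = {"Everyone", "Domain Users", "Authenticated Users"}

def open_access_findings(folder_acls, sensitive_folders):
    """List folders exposed to global groups, flagging those that hold sensitive data."""
    findings = []
    for folder, groups in folder_acls.items():
        exposed_via = groups & GLOBAL_GROUPS
        if exposed_via:
            findings.append({
                "folder": folder,
                "global_groups": sorted(exposed_via),
                "sensitive": folder in sensitive_folders,
            })
    # Sensitive, globally exposed folders first -- that's the risk to fix soonest.
    return sorted(findings, key=lambda f: not f["sensitive"])

folder_acls = {
    r"\\nas\public": {"Everyone", "grp_marketing"},
    r"\\nas\hr\payroll": {"Domain Users", "grp_hr"},
}
print(open_access_findings(folder_acls, sensitive_folders={r"\\nas\hr\payroll"}))
```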

5. Identify Data Owners

IT should keep track of data business owners and the folders and SharePoint sites under their responsibility. By involving data owners, IT can expedite a number of the previously identified tasks, including verifying permissions revocation and review, and identifying data for archival. The net effect is a marked increase in the accuracy of data entitlement permissions and, therefore, data protection.

Access our FREE Full Report including the complete list of IT Must Do’s. 

Using Varonis: Why Data Owners?

(This is one entry in a series of posts about the Varonis Operational Plan – a clear path to data governance. You can find the whole series here.)

One of my first jobs in IT was on the help desk for a medium-sized company. A big part of my job was provisioning access. If your company has shared data (and what organization doesn’t?), the words “I need access to this folder” are probably very familiar to you.

There are countless reasons for modifying access controls: new hires, consultants, role changes, temporary projects, cross-functional teams, terminations, department restructuring, M&A – the list goes on. Coordinating who has access to which data has—detrimentally—become a core responsibility of IT.

Let’s peek inside a typical permissions conversation between an end user and the help desk:

User (to the Help Desk): I need access to a folder in the S: drive, can you help?
Help Desk: Of course. Can you tell me which folder?
User: The folder is called FYQ3-docs. I need access for the next few weeks.
Help Desk: Do you know who manages the folder? To make this change we need an approval.
User: My boss asked me to get access. I can forward you the email?
Help Desk: Sure, that will be good enough.

Look familiar?

In some organizations, this process may be a little more complicated, a little more automated, or both, but in general the process follows this workflow: access is requested by a user, approved by that user’s manager, and provisioned by someone in IT.

That’s the way it’s been done for years, and it works great, right?  Well, not really.  This ostensibly innocent access provisioning workflow can be the seed for the most costly data breaches an organization will ever face.

The wrong people

In this example, the user’s manager is the one providing the approval. That person may not be, and in fact usually isn’t, the person who should be making this decision. The data itself is a business asset, so access to that data is a business decision. That means that the owner of that asset—i.e., the data owner—should be the one making the decision.

Imagine if access to a financial account worked the same way as access to a shared folder—managers would be able to get access for their team without the actual budget owner having any idea about it.  Madness!

Organizations that have an excellent grasp on data ownership and information governance have not only figured out a way to ensure approval is granted by the right person, but they’ve factored the help desk out of the equation completely, freeing up precious resources.

A recent article on the Harvard Business Review blog states:

“Different kinds of assets, people, capital, technology, and data demand different kinds of management. You don’t manage people assets the same way you manage capital assets. Nor should you manage data assets in the same way you manage technology assets. This may be the most fundamental reason for moving responsibility for data out of IT.”

Let’s now re-envision the access provisioning scenario:

  • User fills out a web form describing which data she needs access to, why, and for how long.
  • Request gets automatically routed to the business person in the organization who is best equipped to approve the request – i.e., the data owner.
  • Data owner approves or denies the request by clicking a button.

Much better! The access request is fulfilled by the correct person without involving the requestor’s manager or IT.

Easier said than done

The hard part here, and the reason things have traditionally worked this way, is that when it comes to shared data, we don’t have a good way of figuring out who the actual owner is. IT may have some idea based on group access—if there’s a single group that grants access to a folder, you may be able to figure out the director or manager of that group, for instance. But what happens if data is open to two or three different teams? What about data open to everyone? Identifying and aligning owners is extraordinarily difficult if you rely on traditional methods.

With Varonis, there’s a much better way. Because DatAdvantage is constantly gathering a complete audit record, we can use aggregate access activity to identify likely owners. If the three or four most active users of a folder all report to the same person, it’s highly likely that person is the true data owner. At worst, you’re one phone call away from knowing.

By identifying business owners of data, IT can take the first step toward shifting the burden to the teams who have the right context (and often authority) to be making decisions about access. One challenge with this approach is figuring out which folders actually need owners, something I’ll talk about in the next post.

Image credit: richard-g

Using Varonis: Fixing the Biggest Problems

(This is one entry in a series of posts about the Varonis Operational Plan – a clear path to data governance. You can find the whole series here.)

Now that we have a pretty good idea where the highest-risk data is, the question naturally turns to reducing that risk. Fixing permissions problems on Windows, SharePoint or Exchange has always been a significant operational challenge. I’ve been in plenty of situations as an admin where I know something is broken—a SharePoint site open to Authenticated Users for instance—but I’ve felt powerless to actually address the problem since any permissions change carries the risk of denying access to a user (or process) who needs it. Mistakes can have significant business impact depending on whose access you broke and on what data. Since we’re defining “at-risk” as being valuable data that’s over-exposed, that means that any accessibility problems we create will impact valuable data, and that can create more problems than we started with.

Step 3: Remediate High-Risk Data

The goal is to reduce risk by reducing permissions for those users or processes that don’t require access to the data in question.

The next step in the Varonis Operational Plan is fixing those high-risk access control issues that we’ve identified: data open to global access groups as well as concentrations of sensitive information open to either global groups or groups with many users. Since simply reducing access without any context can cause problems, we need to leverage metadata and automation through DatAdvantage.

Let’s tackle global access first. When everyone can access data, it’s very difficult to know who among the large set of potential users actually needs that access. If we know exactly who’s touching the data, we can be surgical about reducing access without causing any headaches.

DatAdvantage analyzes the data’s audit record over time in conjunction with access controls, showing folders, SharePoint sites, and other repositories that are accessible to global access groups, along with the users who have been accessing that data who wouldn’t have had access without a global access group. In effect, it’s doing an environment-wide simulation to answer the question, “What if I removed every global access group from every ACL tomorrow? Who would be affected?” This report gives you some key information:

  • Which data is open to global access groups
  • Which part of that data is being accessed by users who wouldn’t otherwise be able to access it
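
Conceptually, that analysis cross-references three inputs: which folders are open to global groups, which groups each user belongs to, and who has actually touched the data. Here’s a minimal sketch under assumed input shapes (it does not reflect how DatAdvantage performs the simulation internally):

```python
GLOBAL_GROUPS = {"Everyone", "Domain Users", "Authenticated Users"}

def affected_by_removal(folder_acls, user_groups, access_events):
    """Find users whose observed access depends solely on a global access group."""
    affected = {}
    for event in access_events:
        folder, user = event["folder"], event["user"]
        acl = folder_acls.get(folder, set())
        if not acl & GLOBAL_GROUPS:
            continue  # folder isn't globally exposed
        # Would the user still have access through a non-global group?
        explicit = (acl - GLOBAL_GROUPS) & user_groups.get(user, set())
        if not explicit:
            affected.setdefault(folder, set()).add(user)
    return affected

folder_acls = {r"\\nas\projects": {"Everyone", "grp_pmo"}}
user_groups = {"alice": {"grp_pmo"}, "bob": {"grp_sales"}}
access_events = [
    {"folder": r"\\nas\projects", "user": "alice"},
    {"folder": r"\\nas\projects", "user": "bob"},
]

# bob touched the folder only via "Everyone": removing global groups would affect him.
print(affected_by_removal(folder_acls, user_groups, access_events))
```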

And it’s not just global groups that DatAdvantage lets you do this with. Because every data touch by every user on every monitored server is logged, Varonis lets you do this kind of analysis for any user, in any group, on any file or folder. That means you can safely remediate access to all of the high-risk data without risking productivity. You can actually fix the problem without getting in anyone’s way.

The next step is to start shifting decision making from your IT staff to the people who actually should be making choices about who gets access to data: data owners.

Image credit: harwichs