Archive for: November, 2011

Our Top Predictions for 2012

It’s that time of year again—reruns of It’s a Wonderful Life (or The Lord of the Rings), comfy chairs in front of a blazing fire, libations and cheer, and time to consider what’s around the corner for us next year. This time we’re skipping the long shopping list of predictions in favor of a few pithy ones that you might actually remember. (We’re sure you haven’t forgotten our list of forty things that 2011 would bring, and we’re not going to remind you because of course they all happened.)

Secure Collaboration will go viral

Data will continue to grow at 50% year over year and digital collaboration will continue to be the core of every business process. 2012 will be the year data owners get involved – they will take back access control decisions from IT, and demand automation to analyze data, make better decisions, and eliminate costly, ineffective manual processes.

Organizations will realize that continuing on the current path will have devastating results for their businesses. Infrastructure must be refreshed periodically (convergence is the trend—small, distributed servers are consolidated into fewer, larger, centralized ones), and organizations will soon realize that without intelligence, moving data quickly and safely is difficult or impossible. The wrong data will be moved, the wrong people will have access, and data owners will be needed. Every decision about managing and protecting data will be difficult without adequate context and automation, and the right people will be needed to make these decisions.

We regularly work with organizations that have 10,000 to 100,000-plus shares for which they need to identify data owners. In 2012, organizations are going to find it even harder to identify owners, and to make any data management and protection decisions, without harnessing the power of metadata. If they don’t know who this “stuff” belongs to, it’s just going to grow indefinitely and will be perpetually at risk. It’s dawning on them that they can’t afford that.

Big Data analytics will expand its focus to the biggest data of all—unstructured information sitting on file servers, NAS devices, and in email systems

Effective data governance requires harnessing the power of metadata through intelligent automation. It is not surprising that industry experts are now saying that the same kind of automation is necessary for more than good governance.

In order to harness the power of “Big Data,” you’ll need to analyze these massive amounts of data and look for patterns: how and when the data is used, who uses it, in what sequence, and what it contains. Only then can you effectively run a data-driven organization. It is widely known that the majority of big data in the enterprise is unstructured rather than structured.

Organizations will start keeping track of their assets through automation and we will see some IT departments taking drastic measures, such as shutting down “at risk” servers or access to e-mail if the proper audit trails are not in place

In a recent high profile case, one organization used our software to catch an infiltrator who was operating as a contractor within their firewall. This had enormous implications for the IT security of that organization. If they had not found the suspected hacker when they did who knows what damage could have been done. This individual is now out of their system— and in the justice system.

2012 may be the year servers get shut down and email withdrawn if there is no audit trail. If you couldn’t audit your bank account, you’d want to freeze it until you could—it’s now the same with data. We will see some IT departments take drastic measures, such as shutting down unaudited, “at risk” servers or access to e-mail if the proper audit trails are not in place. Organizations will start keeping better track of their digital assets through automation.

Internal threats will still be a major worry for corporates in 2012 despite the demise of WikiLeaks

In many of the security breaches in 2011, employees or contractors were able to delete or download thousands of files without raising concerns. Often, no one was able to determine what sensitive data they had access to (and secure it before information could be stolen), no one could view an audit trail of what they actually did access after the fact, and certainly no alarms went off while the breach was in progress, even when access activity was unusual.

Much of the data accessed and leaked in recent breaches was composed of unstructured or semi-structured data – documents, spreadsheets, images, presentations, video and more – that resided on file shares accessible throughout organizations.

Download the full Varonis Top Predictions for Data Governance in 2012 White Paper here

Authorized Access – Understanding how US laws affect your authorization processes

In 1986, the United States Congress passed the Computer Fraud and Abuse Act (CFAA). While the law was originally intended to protect government computers and information from hackers, it has been applied to commercial interests as well. Specifically, the Computer Fraud and Abuse Act subjects to punishment anyone who “knowingly and with intent to defraud, accesses a protected computer without authorization, or exceeds authorized access, and by means of such conduct furthers the intended fraud and obtains anything of value.” While it is not our position to advise clients on this topic, it is important to understand how the US courts interpret the phrases “authorized access” and “exceeds authorized access.”

Through litigation, the US legal system has attempted to interpret the CFAA and determine the legal definition of “authorized access” and “exceeds authorized access.” Before getting into the value of Varonis features, it is essential to review the prevailing case law and judicial opinions about this topic.  While there have been a number of cases addressing this issue, there are two cases and an opinion by a US District Court that stand out, each of which provides a basis for current legal decisions that address authorization issues.  Not surprisingly, most available case law involves data “theft” by individuals who, at some level, had permission to access the information that they accessed.  For example:

  • United States v. Nosal – In this case, Nosal (a former employee of Korn/Ferry) obtained proprietary information from his former co-workers, which he used to start a competing business. The former co-workers had authorization to access the information via the access permissions provided to them by Korn/Ferry, but the courts examined whether they “exceeded authorized access” because they had signed a Non-Disclosure Agreement as well as an Acceptable Use Policy.
  • LVRC Holdings LLC v. Brekka – Brekka (an employee of LVRC) emailed business documents to his and his wife’s personal email accounts. Brekka had permission to access the business documents and LVRC did not have an acceptable use policy, so Brekka did not violate any access restrictions and ultimately maintained “Authorized Access.”
  • The United States Court of Appeals for the Seventh Circuit has stated that “an employee accesses a computer without authorization the moment the employee uses a computer or information on a computer in a manner adverse to the employer’s interest.” This opinion held that access permissions were only one factor in determining authorized access. In this case, the access permissions available to the employee were considered, as well as whether the employee used those permissions and data in a manner detrimental to his employer’s interests. In other words, regardless of the permissions available to an employee, a “disloyal” employee may be culpable for accessing information available to them with ill intent. Other courts have offered differing opinions on this specific issue, creating additional confusion.

As you can see, the ability to determine what constitutes authorized access is still subject to interpretation in the courts. Acceptable Use policies and Non-Disclosure Agreements are important, but they are only useful after an incident has taken place.  Written policies and expectations of loyalty don’t safeguard important data and they don’t prevent disloyal employees from using data to their advantage.  Ultimately, IT Administrators must enforce rightful access via best practices–data owner involvement in authorization processes in conjunction with an audit trail to validate acceptable use. In other words, access should be granted purposefully and periodically reviewed.

Varonis products provide the following features which will help to address the legal issues identified above:

  • Complete visibility into the permissions that each individual has across Windows, Unix, Linux, SharePoint and Exchange environments
  • A full audit trail which demonstrates whether an employee has accessed data that an employer would consider important or inappropriate
  • The ability to ensure rightful access, involving data owners in the decision making process
  • The ability to determine the sensitivity of data, as defined by data owners
  • A provisioning system complete with an audit trail which can report on why a person was granted access to a resource, when, and by whom
  • Automated entitlement reviews to ensure that permissions are always appropriate

Moral of the story: Make every effort to ensure and validate rightful access so that you can peacefully co-exist with the vagaries of the law. Varonis products can ensure ongoing authorized access and provide information to support a claim that a person exceeded their authorized access.

Hampton Products

“We have a level of confidence now that we didn’t have before Varonis. DatAdvantage helps us simplify security and know with certainty that files and information at risk for overly permissive access are locked down.”

That quote’s from Brian Millsap, CIO and Vice President at Hampton Products, who announced today that they’ve successfully implemented Varonis DatAdvantage to clean up and maintain access to sensitive data within the organization. Hampton is a leading manufacturer of padlocks and other portable security hardware and, like a lot of us these days, was faced with a tough problem: lots of sensitive information on unstructured data stores with broken access controls. With an IT staff of nine people, they also didn’t have a lot of free resources to throw at the problem.

Being able to do more with less when you’re fixing permissions is key—it’s too much to try and do it all manually. Even if you have a good way of finding all the important data that needs protecting—no small feat—figuring out who actually needs access to each and every folder is a monumental task. Automation is key, and it’s one of the main reasons Hampton Products chose Varonis.


Data Authorization Processes – A need to relive the past

In 1941, the accounting governance body, the American Institute of Certified Public Accountants (AICPA), overhauled its Rules of Professional Conduct. Rule 16 stated: “A member or an associate shall not violate the confidential relationship between himself and his client.” This provision was developed to guide Accountants (Data Stewards) and to reassure their customers (Data Owners) of the confidentiality of business and personal information. Ironically enough, the AICPA, which has been in existence since the 1800s, previously felt that such a provision was unnecessary: “The man with a loose tongue, the man who cannot keep a secret, should never attempt to practice public accounting.”* Prior to this change, the AICPA believed that an Accountant would never risk his professional career by revealing confidential information to a third party.

From the late 1800s through the mid-1980s, the manner in which Accountants stored financial information, and the processes they used to manage it, supported information confidentiality—usually a simple wood filing cabinet and keys. Authentication and authorization protections were simple: if you didn’t have the key to the office, you couldn’t get to the filing cabinet. If you didn’t have the key to the filing cabinet, you couldn’t open it or access the information within. The key to the office was given only to select employees and associates authorized by the Accountant (once again, the Data Steward) to have access to the information. If client information was revealed to a competitor, it was fairly easy to determine who leaked it. In this regard, as early as the 1900s the principle of least privilege existed, and both the data owner and data steward had control of and visibility into the data authorization process.

Unfortunately, the paradigm of data protection has changed, and not in a positive way. Financial information is no longer controlled by a process with a clearly identified data owner and data steward. Most companies have not identified data owners, most don’t have appropriate controls over their data, and most cannot exercise the same level of data owner involvement in access control decisions that existed in early and mid-20th-century accounting. And if a data owner has not been identified, risk is extremely difficult to quantify, appropriate controls are difficult to implement and enforce, and customers will eventually lose faith in the ability of a supplier to protect personal and business data.

Electronic record keeping in conjunction with digital collaboration has overloaded manual authentication and authorization processes, even for those data sets that do have owners. Automation is now necessary to achieve the level of data protection that Accountants enjoyed in the early 1900s, where users are authenticated and owners are identified and participate in the authorization process, armed with the intelligence to make good decisions. No one can dispute the many benefits of electronic recordkeeping. However, as we approach the end of the fiscal year, while many companies are doing tax planning and budgeting for 2011 and 2012, we should all be conscious of the steps our IT suppliers are taking to protect our business and personal data—hopefully they’re using more than a bigger file cabinet.

*Carey, John L. “Professional Ethics of Public Accounting” New York: American Institute of Accountants, 1946.

Open Shares

In my post last week, Share Permissions, I promised I’d write a follow-up post on “open shares.” Open shares, in a nutshell, are folders that are accessible to all (or pretty much all) of the people on the network. In the Windows world, these are folders that are shared over the network via CIFS and accessible to what are called “global access groups,” like Everyone, Domain Users, and Authenticated Users.

In order for a folder to be accessible to a global access group, its NTFS permissions must be set to be accessible by the group, and the folder must be shared or reside within the hierarchy of a share whose permissions are also accessible to the global access group.  For example, for a folder to be accessible, or open, to the Everyone group, the Everyone group must be on its access control list (ACL) with some level of access, and the folder and/or one of its parents must be shared so that Everyone has some level of share permissions. (See Share Permissions for an explanation of how sharing permissions work).
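The two-part rule above—a folder is open to a global access group only when that group has some level of access on both the share permissions and the NTFS ACL—can be sketched in a few lines. This is a minimal illustration using plain dictionaries, not real Windows APIs; the group names and rights shown are just the examples from this post.

```python
# Global access groups named in the post above.
GLOBAL_GROUPS = {"Everyone", "Domain Users", "Authenticated Users"}

def open_to_global_groups(share_acl, ntfs_acl):
    """Return the global groups that can reach this folder over the network.

    share_acl and ntfs_acl map a group name to its set of rights,
    e.g. {"Everyone": {"Read"}}. A group is flagged only if it has
    some level of access in BOTH places.
    """
    reachable = set()
    for group in GLOBAL_GROUPS:
        has_share_access = bool(share_acl.get(group))  # some share permission
        has_ntfs_access = bool(ntfs_acl.get(group))    # some NTFS permission
        if has_share_access and has_ntfs_access:
            reachable.add(group)
    return reachable

# Everyone has Read on the share and Read/Modify on the folder's ACL,
# so the folder is open to Everyone. Domain Users have share access but
# no NTFS entry, so they are not flagged.
share = {"Everyone": {"Read"}, "Domain Users": {"Read"}}
ntfs = {"Everyone": {"Read", "Modify"}}
print(open_to_global_groups(share, ntfs))  # {'Everyone'}
```

In practice the rights themselves would also matter (Read vs. Full Control), but for spotting open shares the mere presence of a global group in both permission sets is the red flag.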

There are many possible combinations that can provide such open access—Everyone may be on the NTFS permissions while Authenticated or Domain Users have share access, Authenticated Users may be a child of another group that has either NTFS or share access, etc. No matter what the combination, the end result is that just about everyone in the organization has access to the data that resides in the folder, and the vast majority of the time that’s bad. To put it simply:
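The nested-group case mentioned above (a global access group that is a child of another group on the ACL) means you can’t just read the ACL directly—you have to expand group membership transitively. Here’s a sketch of that lookup, again with made-up group names and a plain membership map rather than a real directory query:

```python
from collections import deque

def group_has_access(group, acl, member_of):
    """True if `group`, or any group it transitively belongs to, is on `acl`.

    acl is a set of group names granted access; member_of maps a group
    to the groups it is a direct member of. Breadth-first search walks
    up the membership chain.
    """
    seen = set()
    queue = deque([group])
    while queue:
        current = queue.popleft()
        if current in seen:
            continue  # membership graphs can contain cycles
        seen.add(current)
        if current in acl:
            return True
        queue.extend(member_of.get(current, ()))
    return False

# "Authenticated Users" isn't on the ACL directly, but it is a member of
# the hypothetical "All-Staff" group, which is—so it effectively has access.
acl = {"All-Staff"}
member_of = {"Authenticated Users": {"All-Staff"}}
print(group_has_access("Authenticated Users", acl, member_of))  # True
print(group_has_access("Domain Users", acl, member_of))         # False
```

This is why open shares are so hard to spot by eye: the global group often never appears on the folder’s ACL at all, only somewhere up the group hierarchy.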

Open Shares = Bad

Unfortunately, organizations usually have lots of open shares on their servers and NAS devices, and often quite a few contain sensitive data. Using the native tools provided with Windows, these shares are very difficult to find and even harder to fix. Once they are remediated, it’s also difficult to make sure these folders stay locked down and that new, insecure folders aren’t created.

The good news is that metadata framework technology now exists to identify and remediate open shares, prioritize which ones to remediate first based on exposure, content and activity, and make sure that no one who has a legitimate need for access gets cut off. Once open shares are eliminated, a metadata framework can automatically detect a relapse as well as any newly created open shares.