Employees and other insiders pose some of the largest risks to an organization. The risk profile spans a continuum that includes losses of employee productivity, breaches of confidentiality, compromises of data integrity, and degraded availability of organizational resources. It is therefore essential that a firm develop and maintain a mitigation posture that complies with statutory requirements and limitations while balancing the right of employees to a modicum of privacy both in and out of the workplace.
The Economics and Risks of Electronic Employee Monitoring
Outsourced Data Handling: Brief Thoughts on Negligent Entrustment Applicability
I find this subject area to be among the most interesting and fluid in IT today, because the place where law meets technology is still taking shape in the US. Our system of common law, combined with the pace of our legislative process (or lack thereof) relative to the pace of technological innovation, leaves gaps that are tried on a regular basis to develop judicial precedent. Precedent, that is, a judge's interpretation of a tort compared with what that judge finds similar in other cases, combined with the doctrine of stare decisis (until opposing legislation is enacted), governs these subjects more than the actual text of congressional acts or agency regulation. It is an incredibly complicated area: many people educated in law are not technologists, and many technologists have an inadequate education in business law.
Negligent entrustment is covered under state civil codes within the personal injury set of torts (Kionka, 1999). What Rustad and Koenig (2007) detail are issues that might someday become liabilities for a US company, not necessarily ones that are primary liability concerns today. It is still an interesting thought exercise and a set of items that must be evaluated to form an effective risk profile of outsourcing activities. Also, since an individual is entitled to sue for nearly anything, the cost for an organization to defend itself against suit, even in cases that are settled or won, can be substantial. To reduce these eventualities, the due care and due diligence of audit and contract enforcement, validation of contract performance measurements, and adherence to the law of the land where the data originates must be among the foremost concerns of the CIO, CISO, and compliance officer. The ways in which courts rule on data in the coming years have the potential to reshape the IT and software industry in the US. A finding in favor of a suit for negligent entrustment of outsourced data would set a dangerous precedent by deeming the data itself a dangerous instrument. That concept could then easily bleed into many other areas, including software development liability (which is cause for another paper entirely). While there have been attempts to prosecute authors of hacking tools under criminal statutes, to date such activity has been upheld as protected under the 1st Amendment; should that protection fall, all users and creators of data would have significant potential liabilities foisted upon them.
As pointed out by Rustad and Koenig (2007), the incidence of lawsuits brought for negligent security practices is on the rise, but all cases that have resulted in an award have rested on direct liability. Based on the research conducted and the literature reviewed, US courts have yet to decide a case on indirect liability or negligent entrustment, since negligence itself is specific to a failure of due care (absent strict liability statutes) and there is no basis for negligent entrustment when the instrument itself is not directly capable of causing harm (Kionka, 1999). Additionally, when there is a superseding cause that would not have been reasonably foreseeable, there would be no issue of liability. If, however, the outsourcing operations fail to include the due diligence and due care of a reasonable man, then the superseding cause argument would fail and liability would revert to the company as the tortfeasor, since a reasonable expectation of data breach would likely meet the requirements for a proximate cause liability claim (Clarkson, Miller & Jentz, 2003).
Data breaches are inevitable (Huang, Hu & Behara, 2008), and the California appellate court found that evidence of such inevitability could be used to show that the negligence of the defendant, if any, was not a proximate cause of the harm (Smith v. San Francisco, 1953). While it remains essential to protect data under direct liability scenarios, where a failure to exercise due care can be actionable, imagine a world where this were not the case, or where the ability to exclude warranties of merchantability and/or fitness for purpose could not be accomplished through contract. Oracle Corporation might be held liable because its software contained a vulnerability that was exploited by a hacker attacking a bank, or Microsoft could be sued because a workstation crashed and lost some personal data. As highlighted by Ferrera, Lichtenstein, Reder, Bird and Schiano (2004), both of these situations can be shown to involve actual damages, though allowing such inevitable events to fall back on the progenitor of the system would have chilling repercussions for all transactions and systems in the digital world.
Ferrera, G. R., Lichtenstein, S. D., Reder, M. E. K., Bird, R. C., & Schiano, W. T. (2004). CyberLaw: Text and Cases (2nd ed.). Thomson Learning.
Huang, C., Hu, Q., & Behara, R. S. (2008). An economic analysis of the optimal information security investment in the case of a risk-averse firm. International Journal of Production Economics, 114(2), 793–804. doi:10.1016/j.ijpe.2008.04.002
Kionka, E. J. (1999). Torts in a Nutshell (3rd ed.). St. Paul, MN: West Group.
Rustad, M. L., & Koenig, T. H. (2007). Negligent entrustment liability for outsourced data. Journal of Internet Law, 10(10), 3–6. Retrieved from http://web.ebscohost.com.library.capella.edu/ehost/detail?sid=e04a605f-8b74-40ec-8586-a33effec288c@sessionmgr115&vid=1&hid=112&bdata=JnNpdGU9ZWhvc3QtbGl2ZSZzY29wZT1zaXRl#db=bth&AN=24619583
Smith v. San Francisco, 117 Cal. App. 2d 749, 256 P.2d 999 (1953)
Clarkson, K. W., Miller, R. L., & Jentz, G. A. (2003). West's Business Law: Text and Cases (9th ed.). Thomson Learning.
Selection of VPN Solutions
Over the last decade (and specifically since the revision of the encryption export control regulations), there has been a considerable amount of research, development, and adoption of VPN technology, not only by business and government but also in consumer markets. Protection of data in transit, and at rest, is a best practice in industry, a requirement in government, and is finding use in a growing number of commercial applications. Given the wide array of available technology, the question of which single type is best doesn't really need to be asked. Techniques at the upper layers such as SSL/TLS and SSH, IPsec at the network layer, tunneling protocols such as PPTP and L2TP at the link layer, and hardware or software disk encryption for data at rest can all be used together to ensure that data confidentiality and integrity are maintained throughout the data lifecycle. Even with this complete stack, these protocol selections only touch on the various options, combinations, and uses of encryption within each layer.
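As an illustration of the upper-layer piece of that stack, the short Python sketch below opens a TCP connection and wraps it in TLS using the standard-library ssl module. The host and the HEAD request are arbitrary placeholders; the point is only that everything written to the wrapped socket is encrypted in transit, not that this stands in for any particular VPN product.

```python
import socket
import ssl

def fetch_tls_banner(host: str = "example.com", port: int = 443) -> str:
    """Open a TCP connection and wrap it in TLS so data in transit is encrypted."""
    context = ssl.create_default_context()  # sane defaults: certificate validation, modern ciphers
    with socket.create_connection((host, port)) as raw_sock:
        with context.wrap_socket(raw_sock, server_hostname=host) as tls_sock:
            # Everything written to tls_sock is encrypted before it reaches the wire.
            tls_sock.sendall(b"HEAD / HTTP/1.0\r\nHost: " + host.encode() + b"\r\n\r\n")
            return tls_sock.recv(256).decode(errors="replace")

if __name__ == "__main__":
    print(fetch_tls_banner())
```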
Given so many options, the selection of which technology to use, like many other things in IT, must be based on the answers to "what are you trying to do?", "who is going to be using the system?", and "what are your requirements?". At the client end, implementation has become nearly trivial. Windows 7, for example, provides native support for IKEv2, PPTP, L2TP/IPsec, and SSTP with a single drop-down selection. Authentication protocols can be matched to current infrastructure requirements, and data encryption algorithms enforce data protection within the established tunnel. With so many options, how should a VPN even be classified: by authentication mechanism, by layer of operation, by supported encryption strength, or by the endpoints being connected? The short answer is ... it depends on what the VPN intends to accomplish.
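Because the "right" VPN really is requirement-driven, a toy decision helper can make the point concrete. The mapping below is my own simplification: the protocol names come from the Windows 7 list above, but the layer labels and "good for" notes are shorthand for discussion, not a definitive recommendation.

```python
# Illustrative only: mirrors the "it depends on requirements" point above.
VPN_OPTIONS = {
    "IKEv2":       {"layer": "network",        "native_win7": True, "good_for": "mobile clients (MOBIKE reconnect)"},
    "L2TP/IPsec":  {"layer": "link + network", "native_win7": True, "good_for": "remote access with machine certificates"},
    "SSTP":        {"layer": "application",    "native_win7": True, "good_for": "traversing firewalls over TCP 443"},
    "PPTP":        {"layer": "link",           "native_win7": True, "good_for": "legacy support only; weak security"},
}

def shortlist(requirement: str) -> list[str]:
    """Return protocols whose description mentions the stated requirement."""
    return [name for name, attrs in VPN_OPTIONS.items()
            if requirement.lower() in attrs["good_for"].lower()]

print(shortlist("firewall"))   # ['SSTP']
```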
At its most basic, however, a VPN is designed to provide remote connectivity from one system to another. This connectivity may support a remote device (e.g. a travelling laptop) accessing the corporate LAN, or interconnect LANs in a MAN/WAN scenario. The purpose of either approach, at its simplest, is to provide secure connectivity between these systems in order to protect the confidentiality and integrity of data in transit. Given the vast array of endpoint devices, software methods, hybrid approaches, and the various protection levels (and associated costs) each can provide ... selection is in the eye of the beholder.
The Principle of Least Privilege
The materials on http://cccure.org, and specifically the guide by Krause & Tipton, certainly supplemented the material by Shon Harris while I was studying for my CISSP. Even though I hadn't seen the page since sometime in 2007, a recent discussion thread re-linked it as a starting point for a discussion on the principle of least privilege.
While the principle of least privilege can be briefly summarized and defined, evaluating it in the context of a given access control method is quite a different undertaking. The principle of least privilege can be succinctly defined as “limiting the access of authorized users to data they require to perform their duties” (Conrad et al., 2010, p. 47). All access not required is then denied. This is slightly less restrictive than the need-to-know standard, which would be a subset of the authorizations offered under the least privilege approach. Both can be considered “deny by default” provisions, where access must be specifically allocated by a person authorized to grant it.
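A minimal sketch of "deny by default" might look like the following. The resource names and roles are invented purely for illustration; the point is that the absence of an explicit grant is treated as denial.

```python
# Deny by default: nothing is permitted unless explicitly granted (names are illustrative).
ACL: dict[str, set[str]] = {
    "payroll.xlsx": {"hr_manager"},     # granted because the duty requires it
    "network_map.vsd": {"net_admin"},
}

def can_access(user_role: str, resource: str) -> bool:
    """Least privilege / deny by default: absence of a grant means denial."""
    return user_role in ACL.get(resource, set())

assert can_access("hr_manager", "payroll.xlsx")
assert not can_access("hr_manager", "network_map.vsd")   # not required for the role, so denied
```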
Authorization to grant access is the key delineator between the two primary access control methods: Mandatory Access Control (MAC) and Discretionary Access Control (DAC). MAC restricts this authorization to an administrator of the data; owners and users are unable to provision or grant rights to other users. In the DAC model, however, this privilege is granted to data owners and creators, effectively decentralizing the assignment of privilege to content owners.
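The difference in who holds grant authority can be sketched in a few lines of Python. The class and attribute names are illustrative only, assuming a single owner per object under DAC and a single security administrator under MAC.

```python
class DacResource:
    """Discretionary: the owner decides who else may access the object."""
    def __init__(self, owner: str):
        self.owner = owner
        self.acl = {owner}

    def grant(self, grantor: str, grantee: str) -> None:
        if grantor != self.owner:
            raise PermissionError("only the owner may grant access under DAC")
        self.acl.add(grantee)


class MacResource:
    """Mandatory: only the security administrator may change access; owners cannot."""
    def __init__(self, admin: str, classification: str):
        self.admin = admin
        self.classification = classification
        self.acl: set[str] = set()

    def grant(self, grantor: str, grantee: str) -> None:
        if grantor != self.admin:
            raise PermissionError("only the administrator may grant access under MAC")
        self.acl.add(grantee)


doc = DacResource(owner="alice")
doc.grant("alice", "bob")               # allowed: alice owns the document
record = MacResource(admin="secadmin", classification="secret")
record.grant("secadmin", "alice")       # allowed: only the administrator can grant
```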
Not well covered in the linked article by Krause & Tipton is one additional model: Non-Discretionary Access Control (NDAC). The access control methods typically thought of when discussing NDAC are Role-Based Access Control (RBAC) and Attribute-Based Access Control (ABAC). While many will argue that it belongs as a subcategory of MAC, it exists as a separate entity (a minimal RBAC sketch follows the points below). Consider the following:
- “…RBAC is sometimes described as a form of MAC in the sense that users are unavoidably constrained by and have no influence over the enforcement of the organization’s protection policies. However, RBAC is different from TCSEC (Orange Book) MAC. According to NIST, ‘RBAC is a separate and distinct model from MAC and DAC.’ This is a frequently confused (and argued) point: non-discretionary access control is not the same as MAC.” (Conrad et al., 2010)
- The NIST source cited by Conrad has moved, but it is available here: http://csrc.nist.gov/groups/SNS/rbac/
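Here is the minimal RBAC sketch promised above: permissions attach to roles, users acquire permissions only through role membership, and an individual user cannot re-delegate access. The roles, users, and permissions are made up for illustration.

```python
# Minimal RBAC sketch (names are illustrative): access flows user -> role -> permission.
ROLE_PERMISSIONS = {
    "helpdesk": {"reset_password", "view_ticket"},
    "dba":      {"backup_database", "restore_database"},
}
USER_ROLES = {
    "alice": {"helpdesk"},
    "bob":   {"dba", "helpdesk"},
}

def is_authorized(user: str, permission: str) -> bool:
    """A user holds a permission only if one of their assigned roles grants it."""
    return any(permission in ROLE_PERMISSIONS.get(role, set())
               for role in USER_ROLES.get(user, set()))

assert is_authorized("bob", "backup_database")
assert not is_authorized("alice", "backup_database")
```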
Now that least privilege, need-to-know, and the models supporting the assignment of authorization have been defined, the next area to tackle is that of policy and supporting models. Supporting models can be subcategorized into a matrix of security models that includes confidentiality, integrity, information flow, noninterference, take-grant, access control matrix, Graham-Denning, Harrison-Ruzzo-Ullman, Zachman Framework, and Brewer-Nash models.
These models do overlap. For example, the lattice-based, state machine, and Bell-LaPadula models are all confidentiality models, while the Biba and Clark-Wilson models are integrity models. Each of these, however, falls into the larger category of information flow models.
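To make the confidentiality/integrity distinction concrete, the toy rules below encode Bell-LaPadula's "no read up, no write down" and Biba's "no read down, no write up" against a three-level lattice. The level names are examples, not any particular classification scheme.

```python
# Toy illustration of the two rule sets named above (levels are examples only).
LEVELS = {"public": 0, "confidential": 1, "secret": 2}

def blp_allows(subject: str, obj: str, op: str) -> bool:
    """Bell-LaPadula (confidentiality): no read up, no write down."""
    s, o = LEVELS[subject], LEVELS[obj]
    return (op == "read" and s >= o) or (op == "write" and s <= o)

def biba_allows(subject: str, obj: str, op: str) -> bool:
    """Biba (integrity): no read down, no write up."""
    s, o = LEVELS[subject], LEVELS[obj]
    return (op == "read" and s <= o) or (op == "write" and s >= o)

assert not blp_allows("confidential", "secret", "read")   # read up denied
assert not biba_allows("secret", "public", "read")        # read down denied
```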
Avoiding the temptation to define and explain each of these, it will have to be sufficient for now to say that these models are in turn supported by the system's mode of operation. Modes of operation, adapted from DoD 5200.28 and carried forward into the Common Criteria, define an operating mode for each of the MAC models.
While these models themselves do little to protect organizational assets, the adherence to the governance policies they define (including both administrative and technical controls) provides the structural foundation to protect the confidentiality, integrity and availability of information.
Conrad, E., Misenar, S., & Feldman, J. (2010). CISSP Study Guide. Burlington, MA: Syngress.
ISO/IEC. (n.d.). Common Criteria Documents. Retrieved from NSA Common Criteria Evaluation Scheme:
Krause, M., & Tipton, H. (1997). CISSP Open Study Guides. Retrieved from Handbook of Information Security Management:
NIST. (n.d.). DoD 5200.28. Retrieved from Computer Security Resource Center:
Term Paper Overload
Yes, it has been a while since I've published anything of substance - or uploaded new photos (to complete the 2011 and begin the 2012 albums) ... and really the best reason I have can be chalked up to "term paper overload". You see, I did start my Ph.D., and for some unfathomable reason I entered the term expecting a level of coursework similar to that of my Master's program. Let's just say that I should have listened to Tara when she said "I think it's going to be harder than that".
So, right now I've got three term papers due in draft form next week - ranging from 15 to 30 pages each. Add that to the regular research, discussion and everything else the program entails --- 3 classes = bad idea for a parent who works full time.
Guess I should get back to it. I just thought I'd post a quick note to family and friends who wonder why they never seem to hear from me anymore. I'll drop down to two classes next term, which should be much more manageable & I'll have free time somewhere in the late 2014(ish) range, assuming my dissertation goes well.