Five months ago I wrote in these pages a blog post entitled “The Death of Risk.” It was a rant against the recent developments in banking which have featured fat cat bankers up to their ankles in the “moral hazard” miasma, happily collecting their bailouts and bonuses from the overburdened taxpayers. The “mortgage meltdown,” combined with the subsequent “credit crisis,” happened mostly because a few bankers knew their firms were “too big to fail” and took advantage of this insight. I called it “the death of risk” because one of the things we all count on in information security is that the inadvisability of taking inappropriate risk is an almost universally accepted principle. It is not good to “bet the company” because that is usually an inappropriate risk. Some companies have gone out of business because their security was inadequate. But when Lehman Brothers or Bear Stearns went down, it was because somebody else bet the company and then got surprised when they didn’t get bailed out. Risk is everywhere, and coping with it is a major preoccupation of a legion of analysts from economists to CISSPs.
Now comes Donn Parker, one of the preeminent researchers and thinkers in information security, who may have taken the idea of the death of risk a little bit too far. Parker writes in a recent journal article,
“Think of security as a necessary overhead cost of doing business just as are facilities management, legal, audit, human resources, payroll, and accounting, and like the other overheads, it does not produce a return on investment (The return of savings from expenditures for security is unknown since the incidents that would have caused the savings did not occur.) I suggest that you gradually limit the extent of your risk assessments and reporting to meet only the minimum requirements of the law and regulations and remove the word “risk” from your writings, job titles, and job descriptions.”
[emphasis mine] Aaaaaaacckk! (That was me running screaming from the room.)
The article appeared in last month’s issue of The ISSA Journal (volume 8, issue 7, page 12), entitled “Our Excessively Simplistic Information Security Model and How to Fix it.” The main point of the article is Parker’s proposal to expand our traditional “CIA” model (confidentiality, integrity, and availability) into the new “Parkerian Hexad,” in which he introduces three additional attributes, or objectives, of security: utility, authenticity, and possession. He explains why these three new attributes cover things that weren’t covered before and how they complete the model of information security. Parker also adds many new types of controls (yes, Virginia, prevention, detection, and correction are not enough anymore). And he proposes new objectives for information security, including: avoidance of negligence; an orderly and protected society; compliance with laws, regulations, and audits; ethical conduct; and successful commerce and competition. The problem is, these new objectives replace risk reduction. Unfortunately, removing “risk” from the definitional model of information security is impossible and absurd, and to advocate it turns our whole profession on its ear.
There are many things in Parker’s article that I agree with. The three new attributes he proposes can be shown to relate to “CIA” in a very complementary way. As the velocity of information through our systems and networks continues to accelerate, we will very likely see more instances in which a breach of possession occurs while confidentiality is unaffected; availability will be fine but utility will be impaired; and integrity will be intact but authenticity will be revealed to be bad. Parker’s point here is that we have oversimplified the things we are striving for, and in so doing have missed important elements. However, Parker’s arguments against risk are themselves simplistic and rhetorically throw the baby out with the bathwater. He mostly rails against the calculation of aggregate risk for a company or organization, and notes, quite correctly, that the actual risks from disparate and ostensibly independent threats may in many cases be highly correlated, yet we have no idea exactly how they interrelate.
Parker correctly exposes the computation of aggregate risk for a company – or, for that matter, the combination of precise risk results from dissimilar threats (say, the sum of the ALEs, or annualized loss expectancies, from airplane disasters and earthquakes) – as at best an intellectual adventure or, worse, a dangerous illusion. But “remove risk from your writings”? No way does that make sense.
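To make the arithmetic concrete, here is a minimal sketch (my own illustration, not from Parker’s article, with entirely hypothetical dollar figures) of the standard ALE calculation and the naive aggregation Parker objects to – the sum is only meaningful if the threats are truly independent and their losses simply add:

```python
# ALE = SLE * ARO: single loss expectancy (cost of one incident)
# times annualized rate of occurrence (expected incidents per year).

def ale(sle: float, aro: float) -> float:
    """Annualized Loss Expectancy for a single threat."""
    return sle * aro

# Hypothetical figures, chosen only for illustration.
earthquake_ale = ale(sle=5_000_000, aro=0.01)    # rare, catastrophic
plane_crash_ale = ale(sle=2_000_000, aro=0.005)  # rarer still

# The "aggregate risk" figure simply adds the two -- which silently
# assumes the threats are independent and uncorrelated, exactly the
# assumption Parker says we cannot verify.
aggregate = earthquake_ale + plane_crash_ale

print(f"Earthquake ALE:  ${earthquake_ale:,.0f}")   # $50,000
print(f"Plane-crash ALE: ${plane_crash_ale:,.0f}")  # $10,000
print(f"Naive aggregate: ${aggregate:,.0f}")        # $60,000
```

The precision of that $60,000 figure is the illusion: each input is a rough guess, and the addition hides any correlation between the threats.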
In the late 1970s Alfred Kahn, an economist from Cornell University, intemperately referred to a potential outcome of unwise policy as likely to lead to a recession or a deep depression. White House advisors to President Carter objected, and thereafter Kahn promised never to refer to a “recession” by name. He later jokingly told reporters that while he could not comment on the likelihood of an economic downturn, he could talk about a “banana.” Later still, he changed the fruit to a kumquat after a large banana company complained (I’m not making this up…). What are we to do without “risk” to talk about? The ideas of active acceptance of risk, residual risk, and inappropriate risk that we have worked so diligently to nurture are good, and we benefit from that common language. Risk is important…no, it is crucial. Without it we are simply mumbling about the abstruse. If I start talking about an orderly society and ethics, as Donn Parker argues in his article, I’ll use up all of my boardroom time slot explaining terms, and then my controls proposal will get voted down because no one will have a clue what I’m talking about. In the movie “Ghostbusters,” Egon describes the elevated amount of psychokinetic energy in New York as a “Twinkie 35 feet long weighing approximately 600 pounds.” When your latest pen test report discloses multiple severity-one exposures and shocking unpatched vulnerabilities, instead of referring to this as inappropriate risk, you can quote Winston Zeddemore from the movie:
“That’s a big Twinkie.”
In part 2 of this blog, I’ll talk about why I think risk has an undeservedly bad name, what problems have emerged from careless use of risk terms, and how to deal with all of this in a way that helps your program. If you are busy “removing risk” from your writings, at least turn “Track Changes” on so you can get it back later…