London Hospitals: A Failure to Exercise Due Diligence?
Not too long ago a Web posting described how Whipps Cross University Hospital NHS Trust in London had experienced numerous Conficker worm infections in its Windows systems. A spokesperson for the hospital attempted damage control, stating that a mere five percent of its systems had been infected and that the compromised systems were administrative systems only, not medical ones. Additionally, the posting mentioned that several other hospitals in the London area had suffered the same fate.
Somehow I was not shocked by what I read. Although I have not had extensive experience working with organizations in the health care arena, the experience I have had has taught me that things are not as they should be when it comes to the practice of information security in this arena. One would think that a combination of pressures—the need to avoid security-related mishaps that might endanger patients and HIPAA compliance pressures (in the US)—would serve as strong motivation for health care organizations to achieve very high levels of security risk mitigation. But alas, this tends not to be true. Instead, cost-conscious, profit-hungry hospitals and clinics too often do the bare minimum in securing their computers and networks. A good example is Whipps Cross University Hospital. Patch management is one of the cornerstones of an effective information security program, yet unbelievably this hospital by all appearances did not patch the Server service vulnerability described in Microsoft Security Bulletin MS08-067, even though the patch has been available since October 2008. If this hospital cannot even patch a nearly one-year-old critical vulnerability, what else is it failing to patch? What other critical elements of its information security practice are missing or incomplete? And perhaps most importantly, would a potential patient of this hospital really want to receive treatment at a place where information security is not taken more seriously?
I made some comments about the Whipps Cross University Hospital worm infections in a recent SANS NewsBites issue. One reader, Jay Libove of Spain, sent me the following message:
I want to respond to a comment you made in a recent SANS newsletter
(NewsBites, I think) in which – at least as I read it – you implied that
significant computer security failures in the ambit of a hospital might be taken
to suggest significant medical failings in the healthcare services of the
hospital. I propose that this is unlikely to be true, and perhaps even
irresponsibly alarmist. I justify my position in part from my experience in
the airline business. In the airlines, safety of flight is heavily regulated by
government agencies, and of course the cost of a significant safety failure is
extraordinarily high – so high as to make computer security failures generally
pale. Something similar is likely true of any regulated industry, such as
healthcare. As the convergence of technology with, well, nearly everything else
continues (most salient to the hospital example being the computerization of
healthcare records), then hopefully we will see comparable regulation over the formerly-non-core IT areas of these otherwise regulated industries. Indeed, with the worldwide headlong rush into, e.g., the computerization of healthcare records, we are just now entering a phase where your warning may soon become quite valid … but historically I argue that it is alarmist, and may do our cause more harm than good.
I am publishing Jay’s message because his counterarguments are so well expressed. What about regulation: does it really make that much of a difference with respect to security in the health care arena? Regulatory requirements are likely to result in the introduction or bolstering of security controls, true, but such requirements are seldom very strict due to pressure from those potentially affected by them (and often from their Congressional lobbyists as well). Additionally, enforcement of compliance with these requirements is often far from what it needs to be. So although regulatory requirements in the health care and other arenas do make a difference, they generally do not in and of themselves provide sufficient pressure to force organizations to allocate the level of resources needed for truly effective security risk mitigation.
Talk of alarmism seems a bit overblown, but then again, Jay may be correct. It seems to me, however, that keeping pressure on organizations that by all appearances are deficient in major areas of their information security programs is potentially very productive. Look at Microsoft back in the late 1990s and early this decade. Huge numbers of vulnerabilities in Microsoft products were found; instead of doing everything it could to fix these flaws, Microsoft chose to counterattack individuals who were critical of its strategy (or the lack thereof) by labeling them “alarmists.” Even a casual observer would have to admit that all the pressure on Microsoft caused a huge change in the way Microsoft developed its code—a huge change for the good. So I view charges of alarmism with respect to faulty security practices and security-flawed products with quite a bit of skepticism.
Would I stay in Whipps Cross University Hospital? No way! This hospital may have the best physicians in the world. (I did not in my NewsBites comments mean to imply anything to the contrary.) But like nearly everything else these days, state-of-the-art medical services depend on computers. If the computers malfunction, whether for security or other reasons, the skill of the physicians can easily become a moot point. So—Whipps Cross University Hospital and other London-area hospitals—it is time to get your act together when it comes to security. Once you do, please count me in as one of your potential patients!
Oh, and by the way, Jay is looking for a job.