Guest Editorial: More Questions about Information Security Research
For only the second time since I’ve been blogging on this site, I feel compelled to publish the comments of someone who reacted to something I wrote. I know the author, Kevin Spease (the president of the ISSA-Sacramento Chapter), rather well, and his timely and pointed analysis of the growing controversy concerning “pay for play” research institutes could not be better written. Here goes–enjoy.
– – – – – – – – – – – – –
I’m the kind of guy that sniffs the milk in the carton before taking a big gulp. If it smells bad, I wash it down the drain – regardless of the expiration date. No doubt, I’ve dumped out more milk than I should have. For the most part, this method has served me well. I can’t recall the last time I’ve been surprised by the sour taste of curdled milk.
The other day, I was thinking about bad milk and how it reminded me of some recent research studies I’ve read. Obviously, this requires some explanation…
Without a doubt, the information security landscape is a target-rich research environment. We are hungry for answers to complex mathematical problems, like finding an efficient method to factor the product of two large prime numbers. We need to better understand human-factor issues, like the ability of a soldier to enter complex passwords under extreme physical stress in unfavorable environments. We also need answers to more mundane (but no less important) business concerns, such as the financial impact of unfavorable security events.
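To make the factoring point concrete, here is a minimal sketch (toy values, not the actual cryptographic problem at real key sizes) of why this asymmetry matters: multiplying two primes together is one operation, while recovering them by brute-force trial division takes on the order of the square root of their product.

```python
# Minimal sketch: the asymmetry behind factoring-based security (e.g., RSA).
# The primes below are illustrative toy values, nowhere near cryptographic size.

def trial_division_factor(n: int) -> tuple[int, int]:
    """Return a nontrivial factor pair of n by brute-force trial division."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    return 1, n  # no divisor found: n is prime (or 1)

p, q = 65537, 99991              # toy primes; real RSA primes run to hundreds of digits
n = p * q                        # building n costs one multiplication
print(trial_division_factor(n))  # recovering (p, q) costs roughly sqrt(n) divisions
```

At toy sizes this finishes instantly; at real key sizes the same brute-force search would outlast the universe, which is exactly why an efficient factoring method would be such a significant research result.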
In the last few years, I’ve read some very interesting studies. I sheepishly admit most mathematical research flies over my head at 186,000 miles per second. However, I try to read and understand as much new research as possible. While I’ve found plenty of interesting research, I’ve been sadly disappointed by more than a few.
The simple reason is… I can’t trust them.
How do I know I cannot trust a study?
My method is crude, but I use a few simple steps to determine the “trustworthiness” of a study.
Step One – Look for Bias: Look for a tendency or preference toward a specific perspective or result. In other words, are there conditions that may motivate the study’s conclusions to be favorable toward a particular interest? Who are the study’s sponsors, and do the conclusions directly support the sponsor’s cause? Studies cost money, and sponsorship is a necessary part of research. However, if you get the feeling that a marketing department is “throwing their voice,” they most likely are.
Step Two – Look for Repeatability: When analyzing the study, ask yourself if there is enough information for someone to validate the conclusions. If an independent and disinterested researcher with similar skills can sample the same population, perform the same analysis, and arrive at a reasonably similar conclusion, then you can safely assume the study is sound. I’m more likely to trust something I can personally validate, if so inclined.
Step Three – Look for Fallacy: Is there flawed reasoning? Do the conclusions overstate the importance of the findings or attempt to evoke emotion through hyperbole? If you get the feeling that someone is using the study to compel you to action then you should take the study with a large grain of salt.
The information security community has so many questions to answer and problems to solve. Developing security technologies, designing secure networks and making a business case for defending our enterprises require new discoveries and dependable statistics. However, untrustworthy studies tainted with bias, conducted without discipline and riddled with fallacy do more harm than good.
As security professionals, we get enough criticism from our customers. At best, we are perceived as rigid and skeptical. At worst, we are accused of being feet-dragging fear mongers who fail to appreciate the negative impact of our continued waffling. The security of our customers’ information and, in turn, our professional reputations, hinge upon our ability to make sound decisions based on reliable data.
In all honesty, I cringe when I use the findings of some researchers to support my projects because I’m concerned the underlying work is fundamentally flawed (of course, I heavily caveat the research when I do use it). And to make matters worse, I see these findings quoted openly, as fact, in the media.
The next time you are reading a study, I challenge you to evaluate it with a healthy level of skepticism. Approach the study as you would an unsolicited email from an unknown sender. Do some critical thinking before trusting it.
Like milk, I may wash a few too many studies down the drain. But I’d rather do that than develop new technology, make a business decision or advise a customer based on bad milk.