Information Security Can Support Human Ecosystems
The Knights of Columbus and Marist organizations recently completed a survey of consumers and managers about ethics in today’s business world (http://news.yahoo.com/s/usnw/20090226/pl_usnw/majority_of_public_believes_corporate_america_needs_new_moral_direction). One of the striking things about the outcome of the survey is how much agreement there is between consumers and business managers about the lack of ethical standards behind today’s financial mess. This raises the fascinating topic of the ecology of human values within organizations, and of information security’s role in preserving and promoting that ecology.
Years ago I worked for an organization that pioneered aspects of modern information technology as well as information security research. One of the things information security consultants were expected to do back then was to routinely mount “social engineering” attacks on client organizations in order to expose the likelihood that information might be shared inappropriately by members of that organization. Generally speaking, social engineering requires misrepresentation of the social engineer’s identity and role in order to trick the victim, in this case an unsuspecting employee of the organization, into revealing secrets. At our company, however, we defined a hard and fast rule that no consultant should ever be required to tell a lie as a normal or routine part of their day-to-day job. Lying is unethical, even when done as a means to a worthy end. Many people observe religious values that prohibit them from telling a lie. Honesty is the underpinning of much of today’s common law and contract law. Although counterbalanced by the principle of “caveat emptor,” honesty still holds a centrally important role in virtually all we do within business today. Yet in the Knights of Columbus and Marist poll, consumers and business managers alike saw honesty and ethical behavior as lacking, or at least declining, in business organizations today.
When we abolished lying as a job requirement, we had to put something else in its place. We developed a social engineering approach that did not require the consultant to lie but still produced results that illuminated whether individuals within the organization were likely to share confidential or sensitive information even as they tried to be helpful and professional to an outside person seeking it. We believed the results from this tool were highly correlated with those that would likely be obtained through a traditional social engineering exploit. The same insight could therefore be gained through a process that did not require lying and did not have the unacceptable byproduct of tricking valuable employees into doing something they should not have done.
Broadly speaking, information security provides “rules and tools” to knowledge workers for the safe and appropriate use of the extremely valuable information and access they use every day in their jobs. Knowledge workers are very insightful about the effective implementation of information security controls that impact them. They know how controls work and how to get around them. When information security controls exceed their capacity as humans to cope, knowledge workers develop coping mechanisms such as the apocryphal password written on a Post-it note. Effective information security controls recognize that human beings have limitations, such as their ability to remember a complex password string or an encryption key. But information security can also support an ethical organizational culture.
Many information security professionals spend most of their time thinking about “how to catch the bad guys.” However, what if we changed our perspective to “catching the good guys?” I maintain that information security has a role in assisting well-intentioned employees in meeting their ethical responsibilities, and thus in boosting the organizational ecosystem. We could achieve this by providing feedback to employees concerning their adherence to security requirements. For example, can your employees easily recount the number of hours of security awareness presentations they have participated in? How about the number of acknowledgements of understanding they have signed as part of a quasi-legalistic approach to establishing that they know the rules? Could we tell them the relative strength of the password they have selected? How about the number of times they’ve logged on successfully? Of course, it’s routine to report the last successful logon for an account, but how about reporting the number of probable unauthorized attempts at account access?
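The kind of positive feedback described above could be assembled from data most organizations already hold. The sketch below is purely illustrative, not any real product: the class and field names are hypothetical, and the password-strength estimate is a crude charset-times-length entropy calculation rather than a production-grade checker.

```python
import math
from dataclasses import dataclass


def estimate_entropy_bits(password: str) -> float:
    """Rough strength estimate: length x log2(character-set size)."""
    charset = 0
    if any(c.islower() for c in password):
        charset += 26
    if any(c.isupper() for c in password):
        charset += 26
    if any(c.isdigit() for c in password):
        charset += 10
    if any(not c.isalnum() for c in password):
        charset += 32  # rough count of printable symbols
    return len(password) * math.log2(charset) if charset else 0.0


def strength_label(bits: float) -> str:
    """Translate raw bits into feedback an employee can act on."""
    if bits < 40:
        return "weak"
    if bits < 70:
        return "moderate"
    return "strong"


@dataclass
class SecurityFeedback:
    """Positive-adherence metrics reported back to one employee (hypothetical)."""
    awareness_hours: float        # hours of security awareness training attended
    acknowledgements_signed: int  # policy acknowledgements on file
    successful_logons: int
    failed_logon_attempts: int    # probable unauthorized access attempts
    password_strength: str

    def summary(self) -> str:
        return (
            f"Awareness training: {self.awareness_hours} h | "
            f"Policies acknowledged: {self.acknowledgements_signed} | "
            f"Logons: {self.successful_logons} ok, "
            f"{self.failed_logon_attempts} failed | "
            f"Password: {self.password_strength}"
        )
```

A periodic report built this way tells the employee what they are doing right, not only what they did wrong.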
I once had a consulting client that emphasized safety as a primary value within the company. This is a company that had literally blown itself off the map two or three times in its several-hundred-year history, so safety was a value it cultivated as a matter of corporate strategy. In their safety program, if someone found a wrench on the floor, they picked it up and reported it to the safety officer, and it was logged as a safety event. Even though no one had tripped or been injured as a result of the wrench on the floor, the organization treated the proactive identification of safety threats as a leading indicator of fewer safety problems later. They simply ignored the implicit moral hazard contained within most defect reporting models. If I report a defect, doesn’t that mean someone else made a mistake? Won’t my defect report turn into a disciplinary finding against my coworker? Well, human beings make mistakes. And if we want human beings to be more nearly perfect in the execution of their jobs, one of the things we have to do is “increase the mistake rate” in an orderly fashion. That usually means in the presence of an adequate defect discovery and feedback mechanism.
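The wrench-on-the-floor model above could be applied to security just as readily. A minimal sketch of such a blame-free event log follows; the names are invented for illustration, and the key design choice is that reports record what was found, never who caused it, so that reporting stays a positive act rather than an accusation.

```python
from collections import Counter
from dataclasses import dataclass, field


@dataclass
class HazardLog:
    """Blame-free log of proactively reported hazards (hypothetical sketch)."""
    events: list = field(default_factory=list)

    def report(self, category: str, description: str) -> None:
        # Record what was found, not who caused it, to avoid the moral
        # hazard of turning reports into disciplinary findings.
        self.events.append((category, description))

    def leading_indicators(self) -> Counter:
        """Count proactive reports per category; a rising count now is
        treated as a predictor of fewer actual incidents later."""
        return Counter(category for category, _ in self.events)
```

Here, more reports is good news: it means hazards are being found before anyone is hurt.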
Many organizations have a requirement for badge display by employees and visitors. But who actually looks at those badges? After implementing an unpopular new badge display rule at one client, two members of our consulting staff walked about in parts of the client organization where we were not known, displaying badges with an obviously incorrect picture or name. We gave $20 to each individual who challenged us based on looking at the badge and perceiving the apparent masquerade. A few hundred dollars spent on a highly visible program that rewarded good guys making the right choice, and generated some fun besides, was money well spent. At the same company, an individual saw a backhoe digging in the parking lot of a fast food restaurant about a mile away. This individual was a network engineer who had detailed knowledge of the location of the fiber-optic cable routes serving the company’s main campus. He challenged the foreman of the digging crew with the protest that he believed the backhoe was dangerously close to fiber-optic cable. It turned out that the backhoe was only 18 inches away from breaking a fiber-optic cable whose loss would have cost the company approximately $60,000 per minute in revenue (back then that was real money!). We recommended this employee be given a nice cash bonus for carrying through with a difficult and possibly embarrassing effort in a very professional way that saved the company a lot of money.
I argue that preserving the human ecosystem by catching the good guys doing the right thing is at least as important as the usual police work of catching the bad guys, if not more so. If knowledge workers can be made to see that doing the right thing is not merely taken for granted but is an important corporate value, this could pay dividends in many aspects of corporate performance. If we can identify metrics for the effective functioning of security controls, we can develop metrics that support positive and constructive adherence to good security policy. Making such information available to employees, managers, and external stakeholders could be vital to establishing the company’s commitment to the appropriate acceptance of risk through the operation of an effective system of controls.