Risky business

The security expert Bruce Schneier begins an article on how people perceive risk, published in yesterday's Guardian, with the following vignette:

People have a natural intuition about risk, and in many ways it's very good. It fails at times due to a variety of cognitive biases, but for normal risks that people regularly encounter, it works surprisingly well: often better than we give it credit for.

This struck me as I listened to yet another conference presenter complaining about security awareness training. He was talking about the difficulty of getting employees at his company to actually follow his security policies: encrypting data on memory sticks, not sharing passwords, not logging in from untrusted wireless networks. "We have to make people understand the risks," he said.

It seems to me that his co-workers understand the risks better than he does. They know what the real risks are at work, and that they all revolve around not getting the job done. Those risks are real and tangible, and employees feel them all the time. The risks of not following security procedures are much less real.


Schneier doesn't quite say it, but the security consultant and the employees are thinking about two different risks: on the one hand the risk of something bad happening to the data, on the other the risk of the particular employee getting into trouble as a result. The larger the organisation, the more opportunities there are for employees to lose data when "proper procedures" have not been followed. This, of course, explains why government departments are especially prone to losing people's private data. Most civil servants follow procedures most of the time - and even when they don't, most of the time the data will not in fact be compromised. But no system can be completely foolproof, and almost regardless of what safeguards are put in place someone, somewhere will make a mistake.
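
To put a number on that last claim: even if each individual act of data handling carries only a tiny chance of going wrong, the chance that at least one act goes wrong grows relentlessly with scale. Here is a minimal sketch of the arithmetic - the per-event probability and the event counts are invented for illustration, not figures from Schneier's article:

    # Chance that at least one breach occurs across n independent
    # data-handling events, each with per-event breach probability p.
    def p_at_least_one(p: float, n: int) -> float:
        return 1 - (1 - p) ** n

    p = 0.0001  # one mishap per 10,000 handling events (an assumed figure)
    for n in (1_000, 10_000, 100_000, 1_000_000):
        print(f"{n:>9,} events -> {p_at_least_one(p, n):7.2%} chance of at least one breach")

    # Prints roughly 9.52%, 63.21%, 100.00% and 100.00% (the last two are
    # 99.995% and above before rounding): at departmental scale, a mistake
    # somewhere is close to a certainty.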

According to Schneier, the solution is to increase the pressure on staff by summarily dismissing anyone found making even the smallest mistake in security procedure. That way the personal risk of losing their job will loom large in the mind of each employee or civil servant. Without such threats, the temptation is to cut corners - thus avoiding the other risk of being disciplined for not getting the job done on time, quite apart from the natural human desire to do as little as possible.

As Schneier points out, it is quite rational for an employee to reason thus, just as most sensible people, most of the time, don't take too much notice of superfluous safety notices - except, perhaps, to notice how superfluous they are. But at that point his thoughts and mine take different paths, because he implies that clamping down on employees who don't follow the correct security procedures is the most efficient strategy. From the point of view of a security consultant, no doubt it is. But clamping down on security breaches, where they are detected, will not prevent every violation. What it will do is reduce overall efficiency. Non-security related tasks will be accorded less priority, and suffer as a result. Mistakes will be made. Costs will rise. Productivity will fall.

Seen from the wider perspective, the risk of a security breach is simply one risk among many. Tightening security procedures, meanwhile, runs the risk of reduced efficiency, the consequences of which, on the whole, might well be worse. A more effective strategy would be to divide the data into many discrete packets, thus ensuring that breaches, when they occur, could be contained.
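
A toy model makes the containment point visible: splitting the same records across many independent stores caps the damage any single breach can do. The figures below are invented for illustration, and "stores" could equally be separate discs, databases or departments:

    # Toy model: partitioning caps the worst-case size of a single breach.
    # All figures are invented for illustration.
    total_records = 1_000_000  # records held in total (assumed)

    for k in (1, 10, 100, 1_000):
        per_store = total_records // k
        print(f"{k:>5} stores -> one breach exposes at most {per_store:>9,} records")

Note that if each store is as likely to be breached as the original monolith, the expected loss over time is unchanged (more stores means more chances of a breach); what the division removes is the catastrophic single incident in which every record goes at once - which is precisely the containment described above.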

Lurking behind this problem is the little-regarded fact that the word "risk" has at least two quite different meanings. A "risk" is a danger - a crumbling ledge is a risk in this sense, because anyone stepping onto it is likely to have a nasty accident. But "risk" is also the odds that something bad might happen. Even generally safe things - ladders, packets of nuts, school playgrounds and the like - are "risks" in the sense that there is a quantifiable possibility that they will be the cause of an accident. But merely to utter the word "risk" is to bring to mind the inherently dangerous "risks" of the first sort. The actual probabilities of the bad thing happening then fade into the background, because once something has been identified as a potential risk, attention naturally focuses on the catastrophe that might possibly happen (and thus on how to prevent it) rather than on the odds of it happening at all, which may be extremely low.

The problem is worse, of course, when the catastrophe is both rare and newsworthy, such as a terrorist atrocity or the murder of two schoolgirls by a caretaker. "Never again!" cry the newspapers, and in response governments put in place procedures that are costly, over-elaborate and wholly out of proportion to the real risk. Guarding against such improbabilities is a little like buying a lottery ticket; mostly it's a waste of money, and while buying more tickets notionally increases your chance of winning, it unavoidably increases your expenditure. Money can only be spent once: money spent on security measures will be taken away from other, more immediately relevant things, or it will add to costs that must eventually be paid by someone. And that someone will be the taxpayer, or the customer, or both. Over-awareness of risk stifles creativity, dulling economic growth. Over-protected children may suffer in terms of emotional and social development, and be at greater risk in later life as a consequence. Valuable things will not be done because they are considered unacceptably hazardous - and the loss will never be quantifiable, because we can never know what lost opportunities might have produced.

Risk-avoidance, in other words, is a risky business.
