Monday, 1 December 2008

More data security and ineptitude

Somebody was telling me about the changes that were made on the London Underground after the King's Cross fire, which killed 31 people in 1987. A great many changes were made and, taken together, they all but removed the chance of a serious fire in sub-surface stations. One that interested me concerned the fire systems. Back then, fire systems existed but had to be started by a person. What if that person was absent, forgot, or was unavailable for whatever reason? Obviously the system wouldn't operate. The Fennell report concluded that a fire system cannot rely on somebody to start it; it should be automatic, with the ability to override it if it is not needed. This simple but very perceptive conclusion recognised that people are not perfect, even the most able of us. However much people are trained, they sometimes choose to ignore that training, or they simply forget. The King's Cross fire reminded us of that, with fatal consequences.

How does this relate to data security? Losing thousands of people's data does not usually kill anybody, but it would be wrong to call it unimportant; at the same time, if somebody loses data, are they realistically going to be sent to prison or fined a massive amount? The simple reality is that so many people work with data, and probably lose it or leave it unsecured on a day-to-day basis, that we cannot rely on training, processes and guidelines alone for our security. We must make the system robust enough that security cannot be side-stepped, or at least not without a very deliberate choice to do so, which could then attract stricter sanctions. It wouldn't take much (and has possibly already been done) to work out all of the places where data security is an issue and then enforce changes. Examples include mandatory locking screensavers at the desk, mandatory encryption of laptop hard disks and removable drives, and locking of data so that it cannot be arbitrarily copied to pen drives and the like (this can be hard, but it is doable), all implemented in a way that cannot be bypassed. You would then need a robust auditing system so that anything done 'outside of the box' is recorded, along the lines of the sketch below.
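To make the auditing point a bit more concrete, here is a minimal sketch in Java (the same world as ESAPI, mentioned below) of an append-only audit trail that records any action taken outside the normal controls. The class name, file name and the example event are all invented for illustration; a real system would ship events to tamper-evident, centrally held storage rather than a local text file.

import java.io.FileWriter;
import java.io.IOException;
import java.io.PrintWriter;
import java.util.Date;

// Minimal sketch of an append-only audit trail. Names and storage are
// illustrative only; a real system would use tamper-evident, central storage.
public class AuditTrail {

    private static final String LOG_FILE = "audit.log"; // hypothetical location

    // Record who did what to which data set, and when.
    public static synchronized void record(String user, String action, String target)
            throws IOException {
        PrintWriter out = new PrintWriter(new FileWriter(LOG_FILE, true)); // append mode
        try {
            out.println(new Date() + "\t" + user + "\t" + action + "\t" + target);
        } finally {
            out.close();
        }
    }

    public static void main(String[] args) throws IOException {
        // Example: an export to a removable drive is allowed but always logged.
        record("jsmith", "COPY_TO_REMOVABLE", "payroll-2008.csv");
    }
}

The point is not the logging itself but that the call cannot be skipped: the copy is only possible through a path that records it.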

The biggest problem with this is really application support. When people write web sites or databases, particularly for internal use, it is time-consuming and complex to lock data and pages down, and to understand the myriad of security levels and protocols well enough to know what can be circumvented and what cannot. If the tools provided this out of the box (and there are some things along these lines, like the OWASP ESAPI), our job would be much easier. Even on a basic level, the current voices in the media don't seem to realise that a password-protected Windows hard disk can be read directly from another operating system that does not choose to honour the Windows security model. If this is really the level we are at, then government should bin any more data projects until people learn what they are doing. People might not die, but it can be very, very annoying!
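As a rough illustration of what "out of the box" could mean, here is a sketch using the standard Java crypto API, the sort of call a toolkit such as ESAPI wraps up for you: the record is encrypted before it is written anywhere, so a laptop disk or pen drive read from another operating system only ever holds ciphertext. The key handling is deliberately simplified and the record contents are made up; in a real application the key would come from a managed key store, not be generated afresh on every run.

import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;

// Rough sketch: encrypt a record before it touches the disk.
// Key handling is simplified purely for illustration.
public class EncryptBeforeWrite {
    public static void main(String[] args) throws Exception {
        // A real application would load the key from a managed key store.
        KeyGenerator keyGen = KeyGenerator.getInstance("AES");
        keyGen.init(128);
        SecretKey key = keyGen.generateKey();

        Cipher cipher = Cipher.getInstance("AES/CBC/PKCS5Padding");
        cipher.init(Cipher.ENCRYPT_MODE, key); // a random IV is generated for us
        byte[] iv = cipher.getIV();            // stored alongside the ciphertext
        byte[] cipherText = cipher.doFinal("name,address,date of birth".getBytes("UTF-8"));

        // Only the ciphertext (plus the IV) ever reaches the disk or pen drive.
        System.out.println("Record encrypted: " + cipherText.length + " bytes, IV " + iv.length + " bytes");
    }
}

If the platform made this the default path for writing data, rather than something each developer has to remember and get right, the password-protected-but-readable disk problem would largely disappear.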
