There was an unusually frank admission in the Register’s recent report (http://tinyurl.com/3k3j3xv) on their home-grown email SNAFU. Roughly 3000 people received an email containing the names and email addresses of roughly 45000 people because, in the Register’s words, “The two-stage send process that is the norm for all of our mailers was over-looked because someone was in a hurry”.
Shortly after I took delivery of my iPhone, I checked my emails at the end of a training course. A delegate, hanging around to ask me the obligatory ‘how do I sort out my credit’ question, noticed it and immediately told me what to do when I dropped it down the toilet. Leave it submerged in a jar of sugar for a week, apparently – not salt, which supposedly does not absorb the moisture, but replaces it with crystals instead, killing the little guy once and for all.
I have been given variations on this tip four times since – and always with the certainty that this is a ‘when’ not ‘if’ scenario. One of my punters recently explained that the iPhone in the toilet is an occupational hazard for some women, as they secure their phones in their bra-strap when out on the razz, and when they lean forward to flush, it slips out into the bowl. I once saw a man in the Gents at my local cinema conducting an animated conversation, one hand on his phone, the other dealing with urinal business. The admirable comedian Michael Legge (@michaellegge) has a similar anecdote, except his story ends with the person on the phone complaining when Legge uses the hand drier (“I’m on the phone” the caller remonstrates).
I have many things to say about information security, but what I am getting at here is the foundation of all data loss prevention, security and risk management: people can be idiots. Give them enough space, and there is no end to the daft things that they will do. It doesn’t matter how clever they are: witness former Assistant Commissioner Bob Quick unwittingly parading anti-terror documents in front of the photographers at Downing Street, Oliver Letwin’s recent escapades in St James’ Park, or Chris Huhne telling a colleague that he doesn’t want his fingerprints on a leak, but doing it via a public tweet instead of a direct message. All clever and talented men, all channelling Frank Spencer (ask your Dad). It doesn’t matter how obvious the risk is – a friend rang me a while back to ask what to do about the fact that his employer had just discovered that one of their unencrypted laptops had been left on a bus. I asked him why they hadn’t encrypted their laptops, and apparently a senior officer had insisted that they could rely on the professionalism of their staff. Which is rather like the captain of the Titanic saying: ‘No, on second thoughts, let’s just ram it. What’s the worst that could happen?’
I’m helping an organisation to update their information security policy at the moment. And the problem is that for all the practical and intelligent elements that we’re going to include, I just keep thinking “and what are the idiots going to do?”. Most people, most of the time, don’t really need to be told what to do with data – they’ll guard it with their lives, keep it accurate and up-to-date and make it accessible only to those who need it, because they’ve got common sense. But even the best member of staff goes idiot when they’re in a hurry, or they’re under pressure, or they don’t understand the technology. And then there are a very small number who are idiots all day, every day. Look around the table, and if you don’t see the idiot, it’s you. And it’s also the man walking behind you using a closed laptop as a tray for four cups of coffee.
In these days of £120000 fines, idiot identification and prevention is vital. If you can take steps like encryption, access control and the ritual burning of fax machines, the idiot’s room for manoeuvre is restricted and the risk is greatly reduced. If you can show that you had the policies, did the training, and have the idiot’s signature proving they knew what not to do and did it anyway, then even when things go wrong the organisation can at least prove that it acted correctly, and can probably dodge a bullet. If you’re smart, the vast majority who are sensible and reliable may not even feel the constraints, even as they work within them.
Dealing with security breaches is increasingly going to be an exercise in idiot-hunting. The Register’s apparent breach is not that serious (they haven’t lost sensitive or financial data, for one thing), but the ‘someone’ cited in their announcement is pivotal. If the Register can prove that they had proper policies and had clearly communicated them, they haven’t breached the Data Protection Act (even if the ICO gets them to sign a completely unnecessary undertaking, which it probably will). However, if ‘someone’ can argue that they hadn’t been informed of the right practices, there’s a possibility of legitimate ICO action. As a trainer, you’d expect me to tell you that you need to train people. You need to take away options. You need to communicate your messages clearly and to everyone.
But fundamentally, every organisation will continue to employ a few people with a deep capacity for foolishness, and you need to put your mind to work to spot what they’ll come up with next. It’ll probably involve social media, but that’s another blog post.