How to deal with emergencies better

October 28th, 2013 at 12:38 UTC by Ross Anderson

Britain has just been hit by a storm; two people have been killed by falling trees, and one swept out to sea. The rail network is in chaos and over 100,000 homes have lost electric power. What can security engineering teach us about such events?

Risk communication could be very much better. The storm had been forecast for several days but the instructions and advice from authority have almost all been framed in vague and general terms. Our research on browser warnings shows that people mostly ignore vague warnings (“Warning – visiting this web site may harm your computer!”) but pay much more attention to concrete ones (such as “The site you are about to visit has been confirmed to contain software that poses a significant risk to you, with no tangible benefit. It would try to infect your computer with malware designed to steal your bank account and credit card details in order to defraud you”). In fact, making warnings more concrete is the only thing that works here – nudge favourites such as appealing to social norms, or authority, or even putting a cartoon face on the page to activate social cognition, don’t seem to have a significant effect in this context.

So how should the Met Office and the emergency services deal with the next storm?

While driving to work I heard a council official telling people not to take their own saws to fallen tree branches, but to wait for council crews. A left-leaning listener might interpret this as a lawyerly “Right, I’ve covered my backside by saying that” while a conservative-leaning one might hear a trade unionist line “Don’t you dare take bread out of the mouths of the workers!” Government spokespersons score pretty low on most trust scales, and people tend to project on them the lowest motives of whichever party they support least. It would surely have been better to say “If a road is blocked by a tree, just call us and we’ll send a crew round. If you absolutely can’t wait, take care! There are accidents every year when someone cuts halfway through a branch, and it cracks and the rest of the tree falls on them.”

Similarly, in the run-up to the next storm, the weather forecaster might usefully say “Three people were unfortunately killed in the 2013 storm, and 30 in the big storm of 1987. Most fatal injuries are from falling trees, then flying debris, then people being washed out to sea. So stay at home if you can. If you really must go out, keep your eyes open, so you can duck if something is blown your way. And don’t stand right on the seafront to admire the waves. Twenty-foot waves can be awesome, but every so often a forty-foot one comes along. So keep your distance.”

A useful way of thinking about it might be this: what advice would you yourself heed if it came from the politician you trust the least? You won’t buy any of his argument, but you may well accept a reminder of a fact that you knew already.

Entry filed under: Academic papers, Legal issues, News coverage, Politics, Security psychology

6 comments

  • 1. Michael  |  October 30th, 2013 at 18:27 UTC

    I find this post wonderful. I can’t believe how simple, but obvious it is. Why the government (I can only speak for US government) doesn’t do this is beyond me. I know they are worried about being sued, but the communication is much more important. Thank you Dr. Ross.

  • 2. Mate Soos  |  October 30th, 2013 at 20:37 UTC

    Why the government doesn’t do it is probably twofold. First, they lack the expertise. Second, they are afraid to cry wolf too often. Most problems can be avoided with expertise but that costs money, time and requires someone to acknowledge that they might not be good at the task at hand.

Reminds me of the Toyota ECM review PDF (trial hearing transcript) that is in the news. Expertise would have prevented a death and the PR nightmare that ensued. Expertise was available and not overly expensive relative to the risk. Yet it was not employed, probably to save face rather than anything else.

  • 3. Ross Anderson  |  October 30th, 2013 at 20:45 UTC

    Matt

    You may well have a point. Some parts of government understand very well how to grab hold of people’s risk perception; think of the child protection industry, or the terror-industrial complex. Maybe there’s just not enough money in getting people scared about the weather.

  • 4. Chris  |  November 1st, 2013 at 12:02 UTC

It’s interesting to contrast how perceptions of the risk affect the way warnings, vague or otherwise, are received. Storms can be intense, but in the UK I’d warrant they’re not perceived as much more than a very blustery wet day, so what turn out to be simple but important precautions are not really taken seriously, sometimes with fatal consequences.

    Contrast with perceptions of the risk of “terror related incidents”, a vague catch-all phrase in its own right. These are quite rare in the UK but are perceived as being a huge risk, with disproportionate tradeoffs in liberty to feel prepared for them.

    With the former, thinking about a UK storm does not conjure up a dramatic picture. But with the latter, people’s imagination does most of the work and latches onto dramatic, graphic images of 9/11 and soldier Lee Rigby, for example, and extrapolates this in all kinds of ways.

In other words, vagueness of the warning alone does not seem to be the full picture. Vagueness of the warning and internal representation of the consequences both have a part to play, and in some cases, such as “terror”, the latter can swamp the former. Terror risks are extremely vague, but everyone is on their toes looking over their shoulder because they (wrongly) have a very clear, graphic, sustaining picture in their heads.

This seems to suggest that, in order to deal with emergencies better, the consequences can be spelled out, but only up to a point, beyond which the event may just not grab people’s imaginations no matter how direct the warnings. Perhaps in those cases some graphic images or reconstructions could help.

Similarly with people falling victim to malware: they may have all the specific warnings they can take, but they just don’t perceive there to be a real risk, so the warnings go unheeded. Perhaps a graphic campaign showing how malware can blight someone – their finances, their precious photographs, their privacy – is needed in that example.

  • 5. Roger  |  November 3rd, 2013 at 12:49 UTC

    The issue is what Max Weber called the “Iron Cage.” All large organisations tend toward bureaucracy, and in the modern West this has become all-encompassing in government and many large corporations. In the Iron Cage, actions are appropriate or inappropriate solely according to whether there is a formal, written procedure. There is no personal discretion permitted.

    The rationality of the procedure makes no difference: if a procedure will give a clearly nonsensical outcome, but is followed to the letter, then the official has behaved “appropriately.” He or she does not do this because of stupidity: deviation is not permitted, and this becomes so culturally ingrained that it is rare to even consider initiative, even if it would not result in disciplinary action.

    Even more crippling is the converse: if no procedure exists, then nothing can be done at all. Some bureaucratic organisations have dealt with this by “creeping credentialism.” For performing some set of skills, authorisation is conveyed by receiving a certificate of competence, which is issued after attending a course. For example, only a couple of decades ago, to use a ladder you either applied common sense, or perhaps a leading hand would give a 5 minute briefing on the Do’s and Don’ts of safe ladder operation. Such a slap-dash approach is intolerable to the bureaucrat, and today you require a certificate of competence. (No, I’m not joking!)

So when a bureaucrat fails to advise people to take various actions to save themselves, he is not so much concerned about being sued. (In Commonwealth countries, government bureaucrats are largely immune from civil litigation anyway.) What he is concerned about — more likely, can barely even contemplate! — is giving suggestions to people who are not appropriately “qualified.” He cannot suggest sawing up fallen tree branches to people who may not have a tree-doctoring “ticket”, or suggest stockpiling food and medicine to someone who may not have completed a food handlers’ course, or suggest that families should develop an emergency plan when none of them have done a “Crisis Team Leader” course or God knows what else.

If any of you have ever been forced to attend one of these courses, you will realise that the real irony is that about 2/3 of them are complete junk. They exist solely so that a bureaucrat can tick a box on a form stating that “only appropriately qualified people will undertake the task”, without the slightest evidence that a course “graduate” is any better able to do the task. Many have no objective pass standards, others take a whole day and 50 slides to cover material that could have been done (and done better) in just 3-5 minutes of “hands on”, and a few have no tangible content at all. I was on one course where a student was absent almost the whole day, except for “sign in” and “course validation.” He didn’t even take the test, yet still passed. Fortunately, that course wasn’t about operating chainsaws. That one only “qualified” him to enter financial data into the project management system of a very large organisation.

  • 6. Norman Yarvin  |  November 3rd, 2013 at 21:51 UTC

    One problem that people in government face is that most of the time they aren’t speaking to the public directly; if they don’t dumb down their message, odds are the press will do it for them. Or worse, the press will single out one bit, rip it out of context, and present it in a misleading way.
