KM on a dollar a day

Musing on knowledge management, aid and development with limited resources

Action–(Over)Reaction


“For every action, there is an unequal and opposite overreaction.”* could have been Newton’s law of human dynamics.

I’m sure you can’t have missed the Sandy Hook school shooting in the news – and how it is provoking discussion about how to keep kids safe, gun control, mental health, etc.

If you are a parent with school-age children, you have probably also encountered a hurried rollout of new security measures in your schools, designed to allay the fears of parents – and, well, to be seen to “do something”.

In my school district these were immediately put to the test when, apparently, someone who works in security and carries a firearm for his job foolishly forgot to leave his weapon in the car when visiting the school premises to pick up something he had left there earlier. The school went into full lockdown, parents were called, activities were cancelled, news interviews were given, and there is an expectation that security will be tightened further.

But from what I can gather, there wasn’t a real threat from this intruder. And even the new measures that have been put in place are substantially lighter than the precautions that were already in place at Sandy Hook – precautions which were nonetheless unable to prevent the tragedy.

Those of us who remember 9/11 (or any other major tragedy) have probably seen something similar happen. Just think how much more inconvenient and expensive air travel is now than it was before 2001. Outside the security arena, similar things have happened with financial fraud (think of the compliance-heavy Sarbanes-Oxley financial reporting requirements in the US), or with the financial crisis. More mundanely, this can happen inside an organization after a minor fraud, a bad audit, or even a visibly unsuccessful project.

But while the instinct to respond quickly to ensure “never again” is understandable, some of our initial reactions are not the most effective or sustainable. I’ve written previously about the dangers of over-regulation and how it can give the illusion of control over a situation when, by making things over-complicated, we create unforeseen side-effects and may end up less in control and less well-informed rather than more.

A few of the risks of reacting too quickly and heavily in response to a crisis include:

  • The measures we put in place may look tough and be popular – but they might not in fact significantly reduce the risk we are trying to manage or eliminate. Often in a crisis there is a temptation to pick measures based on our perceptions of what they do rather than on what they actually do (an example might be “racial profiling” of “middle-eastern looking” passengers at airport security).
  • They might address the immediate manifestation of a risk, but not the underlying issues that cause it in the first place (an obvious example in school shootings is that while enhanced security *might* reduce the risk of a shooting, it doesn’t address the reasons why these shootings occur in the first place).
  • The measures we put in place might have undesirable consequences – for example, a proposal to arm teachers might create a new opportunity for accidental shootings. Or heightened security might make children feel more anxious rather than more secure, or divert funding away from teachers and classrooms. Or in finance, burdensome reporting rules might be too onerous for small businesses, making them less competitive, while creating an industry of people finding creative ways around the rules at larger firms that have the resources to do so.
  • Lack of proportionality – no risk can be totally eliminated, and the closer you try to reduce it to zero, the more expensive it becomes. Similarly, not all threats are equally likely or have equal consequences. This means there are limits to how much one should be willing to spend, or how much inconvenience one is prepared to put up with, depending on the likelihood of the threat, its severity and the cost of reducing the risk of it happening (a rough sketch of this trade-off follows this list).
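
To make the proportionality point concrete, here is a minimal back-of-the-envelope sketch in Python. It is purely illustrative: the figures are made up, and pricing “inconvenience” in money terms is a simplification. The only point is the shape of the calculation – the expected loss from a threat is roughly its likelihood times its severity, and a measure is only proportionate if the loss it avoids is worth more than what the measure costs.

```python
# Illustrative only: all figures below are hypothetical, not real risk estimates.

def expected_annual_loss(probability_per_year: float, severity: float) -> float:
    """Rough expected yearly loss from a threat: likelihood x severity."""
    return probability_per_year * severity

def is_proportionate(probability_per_year: float, severity: float,
                     risk_reduction: float, annual_cost: float) -> bool:
    """A measure is (crudely) proportionate if the expected loss it avoids
    exceeds its annual cost, with inconvenience priced into the cost."""
    avoided_loss = expected_annual_loss(probability_per_year, severity) * risk_reduction
    return avoided_loss >= annual_cost

# Hypothetical example: a very unlikely but severe threat, and a costly
# measure that only trims the remaining risk by 20%.
print(is_proportionate(probability_per_year=1e-6, severity=50_000_000,
                       risk_reduction=0.2, annual_cost=250_000))
# -> False: the measure costs far more than the risk it removes.
```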

In a sense this type of reaction is an example of a “logical fallacy” (although not one covered by this fabulous infographic). I didn’t know this before, but apparently in security circles this is known as the “Affect Heuristic”, which Wikipedia defines as “…a mental shortcut that allows people to make decisions and solve problems quickly and efficiently, in which current emotion (fear, pleasure, surprise, etc.) influences decisions. In other words, it is a type of heuristic in which emotional response, or ‘affect’ in psychological terms, plays a lead role. It is a subconscious process that shortens the decision-making process and allows people to function without having to complete an extensive search for information….”

While this explanation stresses the value of this response as an immediate reaction to danger when full information is not available – a kind of “gut” survival instinct – what is troubling is its durability: once the initial danger has passed we often stick with our initial response, despite having the time to do a better analysis of the situation – to understand the threat, the underlying issues that lead to it, its likelihood, its potential consequences, and the methods and costs of reducing the risk (which includes looking at the underlying causes of the threat, not only its immediate manifestation).

This is probably a combination of how the human mind works and how our political and decision-making systems work – i.e. the need to appeal to the popular emotional reaction to an event rather than to proportionately tackle its real causes. I don’t have an immediate answer as to how we can reduce the risk of inappropriate overreaction to negative events – but maybe talking about this a bit more, including demonstrating some of the problems of this reasoning with data, would be a good start.

Postscript: Last night, after posting this blog, I realized that this phenomenon also often applies to the international response, especially the aid response, in emergencies. Thinking back on the responses to both the Asian tsunami and the Haiti earthquake, a few similar things are visible: i) an overreaction in terms of the resources and attention provided – at least in comparison with the under-attention to other, more chronic emergencies – which often led to piles of money that needed to be disbursed through poorly thought out programmes; ii) a quick choice of inappropriate responses from some, such as providing goods that couldn’t easily be used, aka #swedow (stuff we don’t want) – examples are shoes, second-hand clothes, toys, unskilled volunteers…; iii) a failure to address underlying causes rather than dealing only with the surface. Of course it’s hard to deal with the underlying cause of earthquakes – but you can and should look at the reasons why a country isn’t sufficiently resilient to deal with them effectively.

All the more reason for more research and public education on what actually works – whether in emergency response, or in security.

*This quote is not mine, but I couldn’t find the original source.

Written by Ian Thorpe

January 8, 2013 at 4:25 pm
