The problem with the big picture
In a previous post, Black and White, I ranted about how most development discussions end up being very polarized. The recent scandal around Greg Mortenson, and the ensuing discussion, lead me to think that this polarization is part of a larger phenomenon, a failure in our way of thinking about the world, which I’m going to call “Big Picture Syndrome”.
So my thesis: the universe is an extremely vast and complex place, and our comparatively tiny brains can only really grasp a small part of it. This is a result both of any individual’s limited life experience and of their cognitive limitations.
But we need to make sense of the world in order to make decisions and take actions. To do this we take shortcuts, building mental models of the world to help us understand and predict it. More sophisticated minds might also continually refine this model over time based on new information – but what we have is nevertheless a mental model of the world, one that is far less complex and far less rich than the actual world we inhabit.
Humans are naturally inclined to find mental shortcuts to explain the world – especially for those things of which we have little direct knowledge and no easy means of verification. These mental models are heavily influenced by our own experience, which we assume is typical of the wider world, even though statistically speaking that’s unlikely to be true (for any one individual, that is).
So this leads us to adopt particular positions, whether about politics, the value of RCTs, celebrity activism, private sector involvement in aid, markets versus central planning, etc. We then interpret subsequent experience in this light and tend to use new evidence to confirm what we already believe (also known as confirmation bias). Often, only when the evidence contradicting our beliefs becomes overwhelming and we experience “cognitive dissonance” are we willing to change our models.
A related problem is how many people react to scientific research. Researchers, quite rightly, add all sorts of caveats and disclaimers to their work to explain what can and cannot be inferred from it – and usually this means that, technically speaking, very little can be said to be certain. But non-experts in a field tend either to over-interpret the research as being much more definitive than it actually is (especially if it confirms their existing world view), or to dismiss it, saying that if the researcher adds so many caveats then it doesn’t really tell you anything with sufficient certainty to take it seriously. Until recently at least, a good example has been how mainstream society, especially in the US, has treated climate science research.
Another manifestation of “big picture syndrome” is how we treat “heroes”. Some people deliberately seek to be heroes, but many have this role thrust upon them. When we look at individual leaders it’s too easy to think of them as positive all round, rather than as complex individuals with both positive and negative characteristics – ones which vary depending on the circumstances they are in, so that they may behave honourably in one set of circumstances and dishonourably in another. This is known as the “halo effect”. We turn a blind eye to the faults of the leaders we idealize until the evidence of their failings becomes too big to ignore – then we suddenly need to turn on them and find fault in everything they do, or to claim we always doubted them. We also often seek to personify a cause or issue by identifying individual leaders and focussing on them rather than on the cause itself, since it is easier to relate to a person than to an abstract concept.
Another manifestation of “big picture syndrome” is the idea that professions other than our own, especially ones we don’t understand, are actually easier and less messy than our own. For example, if you are not an economist it’s hard to understand why economists don’t know more about running the economy, and you may have the sneaking feeling that if you studied a bit you could do a better job yourself (it can’t really be that hard, can it?). But if someone else were to attempt to do your job – well, that’s different – it needs extensive training and experience, and even then we still don’t know many of the answers. This is probably a major reason why there are so many DIY aid workers who come into this work without real knowledge or experience.
What can be done about this? I don’t really have any good answers. Self-awareness and self-questioning – and certainly relying on the questioning of others to challenge our assumptions – might be a good place to start.
Of course this whole blog post is a gross oversimplification – I hope you can live with that.