The sound of the crowd
There’s been a lot of good stuff on various aspects of crowdsourcing lately, which makes me wonder if it is finally coming into its own. The debate still rages, though, about whether the crowd can outperform an expert at finding the best answer to a problem.
Ben Ramalingam has a good piece on his blog that looks at some of the latest research on the conditions needed for crowds to be “smart” and outperform experts, as well as some of the potential implications for the aid world. James Surowiecki also looked at this some years ago in his book “The Wisdom of Crowds” – I summarized some of its main points, and what I saw then as possibilities for aid and development, here a couple of years ago.
One area where centralized expertise is still king, in the aid world at least, is guidance and procedures. We still often think it’s best to decide centrally what is best for offices and programmes that are run far from the centre. Nick Milton wrote a great piece on this, “Real men don’t follow procedures”, which looks at why people often don’t follow centrally issued guidance – mainly because it’s seen as incorrect and as failing to reflect the reality of those who have to implement it. His proposed remedy: involve the users of guidance in writing it – drawing on the expertise of a crowd of users rather than that of a smaller group of “experts”.
The example of guidance illustrates an important aspect of crowds and crowdsourcing that is often overlooked. The debate on crowdsourcing is often pitched as the deep expertise of one versus the aggregated, very limited knowledge of many. In reality, crowds often contain a large degree of expertise – unevenly distributed, and not easily surfaced by traditional ways of working. To explore the guidance issue further: users of guidance have a lot of directly relevant experience of what works and what does not, which can be very helpful for developing procedures both collectively and individually. They might not have a full global perspective on every situation where a procedure will be used, and they might not have technical or theoretical expertise in all the areas the guidance covers – but they are nevertheless experts in their own right. So it’s not only a question of whether a crowd can outperform an expert – it may well be that their knowledge is complementary.
This idea quite likely applies to other topics too. Take crowdsourced data collection in emergencies, which has created such a heated debate. Clearly, those who live through an emergency are not generally likely to be experts in emergency response or emergency data collection. But at the same time they are experts in actually experiencing the crisis, in a way that external experts are not – and the two can therefore be seen as complementary viewpoints or data sources.
I couldn’t finish this blog post without a shoutout to my latest social media must-use/time-wasting distraction, Quora. It’s a very interesting experiment in crowdsourcing in that it combines features of existing crowdsourced question-and-answer systems (such as Wikipedia, Yahoo! Answers etc.) with social networking: you can follow topics and questions and vote them up, but you can also follow the contributions of individuals. Through social networks, and some good marketing, they seem to have signed up a lot of people with expertise in various fields, some quite high profile, and they are seeing massive growth and generating some interesting questions and answers. At the same time there’s already a lot of dross and inanity there. I’m curious to see, as membership grows and more and more people contribute on things they – to put it politely – don’t have much expertise in, whether the content will go the way of YouTube comments or whether the good answers will rise to the top. It would be interesting to see how far Quora meets the theoretical conditions for smart crowds, and of course how well it actually works out in practice.