Thursday, April 03, 2014

Amazing Evidence for the Wisdom of Crowds

Check out the amazing results of the "Good Judgment Project," profiled here by NPR. Three psychologists - Philip Tetlock, Barbara Mellers, and Don Moore - created the project in collaboration with members of the government intelligence community. Over the past three years, the researchers recruited roughly 3,000 people to make probability estimates about geopolitical issues, such as the threat of a North Korean missile attack. The NPR article profiles one such person, a pharmacist living in Maryland with no special expertise in international affairs. She received some basic training in probability estimation and then set out to answer the various questions the researchers posed. Her responses have been among the most accurate of all 3,000 average citizens participating in the study. She credits Google searches as her most useful tool for learning about an issue before making a prediction. The NPR story notes that, "According to one report, the predictions made by the Good Judgment Project are often better even than intelligence analysts with access to classified information, and many of the people involved in the project have been astonished by its success at making accurate predictions."

What's going on here? First, it clearly demonstrates the wisdom of crowds. If you pool the collective intellect of a large, diverse group of people, you can often get the right answer more frequently than by relying on a few experts. Of course, the wisdom of crowds only works if the members of that large, diverse group make independent judgments. If people are influenced by others, the wisdom of the crowd breaks down. Second, as the story indicates, "If you want people to get better at making predictions, you need to keep score of how accurate their predictions turn out to be, so they have concrete feedback." The researchers have done that throughout the study. Finally, why do the experts often stumble? They are subject to many biases. For instance, they often fall into the confirmation bias trap. They may have more data at their disposal, but they often rely on the information that confirms their pre-existing views. Moreover, the experts may not be making independent judgments. They may be influenced to a large degree by others in their work group or agency. In that way, they may be subject to conformity pressures.
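To make the pooling and scoring ideas concrete, here is a minimal Python sketch. It is my own illustration, not the Good Judgment Project's actual method: the true probability and the noise level are made-up parameters. It simulates 3,000 independent, noisy probability estimates, averages them into a single crowd forecast, and scores everything with the Brier score, a standard feedback metric for probability forecasts.

```python
# Illustrative sketch only (assumed parameters, not the project's real method):
# pool independent probability estimates, then score forecasts with the
# Brier score so forecasters get concrete feedback on their accuracy.
import random

random.seed(42)

TRUE_PROB = 0.7        # hypothetical true probability of the event
N_FORECASTERS = 3000   # roughly the size of the study's pool

# Each forecaster makes an independent, noisy probability estimate,
# clamped to the valid [0, 1] range.
estimates = [min(1.0, max(0.0, random.gauss(TRUE_PROB, 0.2)))
             for _ in range(N_FORECASTERS)]

# The "crowd" forecast is the simple average of the independent estimates.
crowd_forecast = sum(estimates) / len(estimates)

def brier_score(forecast: float, outcome: int) -> float:
    """Squared error between a probability forecast and the 0/1 outcome.
    Lower is better; it rewards both accuracy and calibration."""
    return (forecast - outcome) ** 2

# Simulate the event resolving, then compare feedback scores.
outcome = 1 if random.random() < TRUE_PROB else 0
avg_individual = sum(brier_score(e, outcome) for e in estimates) / len(estimates)
crowd = brier_score(crowd_forecast, outcome)

print(f"crowd forecast:           {crowd_forecast:.3f}")
print(f"average individual Brier: {avg_individual:.4f}")
print(f"crowd Brier:              {crowd:.4f}")  # never worse than the average
```

Because squared error is convex, the pooled forecast's Brier score can never be worse than the average individual's, and the gain comes precisely from the independence of the estimates: if forecasters copy one another, their errors correlate and the averaging advantage disappears.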

1 comment:

Anonymous said...

Hey Michael - I found this NPR piece on crowdsourcing data very exciting and fascinating. Thanks for sharing. A documentary on Netflix called 'Surviving Progress' also covers this interesting area of human progress. Anyway, keep up the great posts on business strategy and corporate development - I truly enjoy them.

Best,
Colin