Jessica Stillman writes this week in Inc. about a new research study conducted at Duke University. Mark Leary and his colleagues conducted a series of studies examining the impact of "intellectual humility." Stillman writes, "In everyday language, it means the willingness to accept that you might be wrong and to not get defensive when arguments or information that's unfavorable to your position comes to light. And according to this new study, those who lack this quality make markedly worse choices than those who have it in abundance."
Stillman notes that others have written about this concept previously, though this research at Duke provides strong statistical evidence of the impact of intellectual humility. An article published on Duke's website cites some of the key findings: "People who displayed intellectual humility also did a better job evaluating the quality of evidence -- even in mundane matters. For instance, when presented with arguments about the benefits of flossing, intellectually humble people correctly distinguished strong, fact-based arguments from weak ones."
Stillman notes that Stanford's Bob Sutton wrote about intellectual humility a decade ago, and he explained it using a phrase made famous by people at the Institute for the Future in Palo Alto, California. I love the phrase, as it conveys the essence of intellectual humility. Sutton explains that people should have "strong opinions, weakly held." Here is Sutton's full explanation of this phrase:
Perhaps the best description I've ever seen of how wise people act comes from the amazing folks at Palo Alto's Institute for the Future... they advise people to have "strong opinions, which are weakly held." They've been giving this advice for years, and I understand that it was first developed by Institute Director Paul Saffo.
Bob explained that weak opinions are problematic because people aren't inspired to develop the best arguments possible for them, or to put forth the energy required to test them. It is just as important, however, not to be too attached to what you believe, because attachment undermines your ability to "see" and "hear" evidence that clashes with your opinions.