The trouble with the world is
that the stupid are cocksure
and the intelligent are full of doubt.
In the instance that prompts this post, X = "overconfidence in making decisions." If we view overconfidence as confidence that outruns the knowledge needed to make a sound decision, the connection to the quote above is clear. This is a question of leadership and strategic path-finding, essential qualities in any university administrator.
The researchers are Dominic D. P. Johnson and James H. Fowler, who posted their article on the physics pre-print server arXiv.org, indexed as arXiv:0909.4043v1. I found it via the MIT Technology Review's arXiv blog. From the abstract:
Here, we present an evolutionary model that shows overconfidence actually maximizes individual fitness and populations will tend to become overconfident, as long as the resources at stake during conflicts exceed twice the cost of competition. This is because overconfident individuals make more challenges when there is uncertainty about the strength of opponents (and thus the outcome of conflicts), while less confident individuals shy away from many conflicts they would win. Where the value of a prize is at least twice the cost of trying, overconfidence is the best strategy. The model suggests that the conditions under which humans would have evolved to have a "rational" unbiased view of their own capabilities are exceedingly rare, and it helps to explain why resource-rich environments can paradoxically create more conflict. Moreover, the fact that overconfident populations are evolutionarily stable may be one reason why overconfidence persists today in politics, business, and finance, even if it causes occasional disasters.

There's a lot to absorb there, and other sources support some of the assertions. Take, for example, the implication that most humans can't accurately self-assess their abilities: the Dunning-Kruger effect is precisely the inability to recognize one's own incompetence.
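The "twice the cost" threshold falls out of a back-of-the-envelope payoff calculation. The sketch below is my own illustration, not code from the paper: against an evenly matched opponent who always contests, claiming pays 0.5*r - c in expectation, which is positive exactly when r > 2c.

```python
def expected_claim_payoff(r, c, p_win=0.5, p_opp_claims=1.0):
    """Expected payoff of claiming a resource worth r.

    With probability p_opp_claims the opponent also claims and the
    claim becomes a fight: you win the resource with probability
    p_win, but pay the contest cost c win or lose.  If the opponent
    stays out, you take r uncontested.  Not claiming always pays 0.
    """
    contested = p_win * r - c
    uncontested = r
    return p_opp_claims * contested + (1 - p_opp_claims) * uncontested

# Against an evenly matched opponent who always contests, the
# break-even point sits exactly at r = 2c:
print(expected_claim_payoff(r=3.0, c=1.0))  # stakes above 2c: claiming pays
print(expected_claim_payoff(r=2.0, c=1.0))  # exactly 2c: break-even
print(expected_claim_payoff(r=1.5, c=1.0))  # below 2c: claiming loses
```

Any uncertainty only strengthens the case for claiming: whenever the opponent might stay out, the claimant sometimes takes the prize for free, so the effective threshold for "always claim" is even lower than 2c.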
From a review at nytimes.com, "Among the Inept, Researchers Discover, Ignorance Is Bliss":
People who do things badly, Dr. Dunning has found in studies conducted with a graduate student, Justin Kruger, are usually supremely confident of their abilities -- more confident, in fact, than people who do things well.

From the Wikipedia page:
"Not only do they reach erroneous conclusions and make unfortunate choices, but their incompetence robs them of the ability to realize it," wrote Dr. Kruger [.]

Kruger and Dunning gave subjects specific tasks (logic, grammar and telling funny from unfunny jokes). Subjects had to judge how well they had done on these tasks, relative to the rest of the group. This enabled a direct comparison of their real and believed ranks. For analysis, the results were divided into four groups, depending on actual task scores.

All groups put themselves above average. This meant that the lowest-scoring group (the bottom 25%) showed a very large illusory superiority. Although their test scores were in the 12.5th percentile on average, they estimated themselves to be in the 62nd. Kruger and Dunning explained that those who were worst at the tasks were also worst at recognising skill in those tasks.

It gets worse, because leaders aren't necessarily chosen for being the most able. A 2008 Live Science article is entitled "Narcissists Tend to Become Leaders." After assessing personality traits, the researchers put undergraduates into groups of four and asked each group to elect a leader. Conclusions from the research include:

[A] new study shows individuals who are overconfident about their abilities are most likely to step in as leaders, be they politicians or power brokers.

However, their initiative doesn't mean they are the best leaders. The study also found narcissists don't outperform others in leadership roles.

Narcissists tend to be egotistical types who exaggerate their talents and abilities, and lack empathy for others.

Meanwhile, ScienceBlog reports on a study in Sociological Inquiry:

Co-author Steven Hoffman, Ph.D., visiting assistant professor of sociology at the University at Buffalo, says, "Our data shows substantial support for a cognitive theory known as 'motivated reasoning,' which suggests that rather than search rationally for information that either confirms or disconfirms a particular belief, people actually seek out information that confirms what they already believe.

"In fact," he says, "for the most part people completely ignore contrary information."

This is echoed in a similar piece in Kellogg Insight:

People have preferences. But they cannot choose any old thing they like because they have to be able to rationalize the choice[.]

Rationalization means that people are constrained optimizers, and one of the constraints [in the way of choosing a preference] is that they have a psyche that requires a rationale[.]

Russell was also right in that too much information makes it easier to construct a bad theory. Other research shows that fear leads to pain avoidance and bad decisions, as in "scared money never wins," and that the pull of irrational behavior is strong.
In summary:
- When the stakes are high, taking risks is beneficial on average to the individual (though not necessarily to the group)
- Ignorance is a fixed point under self-observation: you don't know it when you have it
- Leaders tend to be those who overrate their own abilities
- Decisions tend to be based on internal biases
- Once decisions are made, even on insufficient evidence, people stick to them with explanations that reinforce their choice
- Decisions tend to be degraded by fear and by too much information
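The claim that risk-taking pays off on average when the stakes are high can be sketched with a toy simulation loosely inspired by the Johnson and Fowler setup. Everything here (uniform strengths, the noise level, the bias value, the payoffs) is an illustrative assumption of mine, not the paper's actual model:

```python
import random

def mean_payoff(bias, r, c, rounds=20000, noise=0.3, seed=7):
    """Average payoff per round for an agent whose self-estimate is
    inflated by `bias`, facing a fresh unbiased opponent each round.

    True strengths are uniform on [0, 1].  Each side claims the prize
    if its noisy estimate of itself exceeds its noisy estimate of the
    other.  If both claim, the truly stronger side takes r and both
    pay the contest cost c; a lone claimant takes r for free.
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(rounds):
        me, opp = rng.random(), rng.random()
        i_claim = me + bias + rng.gauss(0, noise) > opp + rng.gauss(0, noise)
        opp_claims = opp + rng.gauss(0, noise) > me + rng.gauss(0, noise)
        if i_claim and opp_claims:
            total += (r if me > opp else 0.0) - c
        elif i_claim:
            total += r
    return total / rounds

# With the prize worth four times the contest cost (well past the 2c
# threshold), the overconfident agent comes out ahead on average:
unbiased = mean_payoff(bias=0.0, r=4.0, c=1.0)
cocky = mean_payoff(bias=0.5, r=4.0, c=1.0)
```

The mechanism matches the abstract: the biased agent enters more contests, and when the prize dwarfs the cost, the fights it shouldn't have picked are more than paid for by the prizes the timid agent left on the table.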
Case study: Russia's 1917 offensive. Kerensky was the Minister of War in the coalition government. As Orlando Figes puts it in A People's Tragedy (selected quotes, starting on page 410):
All the nation's hope and expectations rested on the frail shoulders of Kerensky [...]

Kerensky reveled in his role. He had always seen himself as the leader of the nation, above party or class interests. The adulation went to his head. He became obsessed with the idea of leading the army to glory and of covering himself in honor. He began to model himself on Napoleon.

There were indications, however, that the fighting men were not keen on the offensive the Minister had his heart set on:

[A]mong the vast majority of the rank and file, the mood of the soldiers was much more negative. Kerensky was frequently heckled by such troops during his trips to the Front, yet he never seemed to register the warning that this conveyed.

The general in charge, Brusilov, had second thoughts too:

[H]e found the troops in a state of complete demoralization. According to one of his senior aides, Brusilov had to avoid using the words 'offensive' or 'advance' in case the soldiers attacked him.

As Brusilov saw it, the soldiers were so obsessed with the idea of peace that they would have been prepared to support the Tsar himself, so long as he promised to bring the war to an end.

Despite these warning signs, the attack went ahead. Figes sums up the effect of the offensive (pg 408):

With hindsight it is clear that the military and political leaders of the Provisional Government were deluded by their own optimism. [...] [One prediction was] that the Russian losses would be in the region of 6,000 men; but the actual number turned out to be just short of 400,000, and the number of deserters perhaps even greater. This was a huge human price to pay for a piece of wishful thinking. Politically, the costs were even higher. For there is no doubt that the launching--let alone the failure--of the offensive led directly to the summer crisis which culminated in the Bolshevik seizure of power in October.

It's impossible to know how differently the 20th century might have turned out under better leadership, but this episode of narcissism and willful ignorance certainly fed one of its greatest tragedies.
Note: If you're at all interested in Bertrand Russell and his quest to put mathematics on a logical foundation, then do yourself a favor and check out Logicomix--a most unlikely graphic novel about this very topic.