Earlier I wrote about two views of probability that Riccardo Rebonato outlined in his book Plight of the Fortune Tellers. Today I’m turning my attention to the System I and System II modes of probabilistic assessment, presumably named by someone utterly lacking in imagination. “The former provides fast, approximate, but not very accurate responses. The latter is more deliberative, much slower, and correspondingly more accurate. Some neurophysiologists believe that the distinction is ‘real,’ i.e., that different parts of the brain are actually engaged in System I and System II cognitive operations.” (p. 28)
Rebonato opts to focus on System I responses to uncertainties. He starts with our ancestor in the jungle where a bush suddenly starts to rustle. “It does not matter greatly if the probability of the rustling being due to a crouching leopard poised to spring is 42.8% or 61.7%: quickly running away is an appropriate response in either case.” (pp. 28-29) Our ancestors had to develop quick-and-dirty rules—run quickly but don’t be a nervous Nellie, because “there [was] no safe way to err.” (p. 29)
Heuristics, or rules of thumb, that inform our probability assessments are well documented in behavioral finance literature. They often do a surprisingly good job. System I allows us to deal “(imperfectly, but adequately enough to survive) with probabilities in the range, say, 10-90%. But the more remote the risk, the more difficult it is for the evolutionary advantage of being able to assess this risk efficiently to establish itself.” (p. 35)
For instance, in studies looking at how much people are willing to pay for insurance, certain irrationalities arise. If the perceived probability of a risk is very low, people do one of two things. Either they mentally set the probability to zero and therefore refuse to buy insurance at all, or they set the probability above a certain threshold and are willing to overpay for insurance. (Think, for instance, of the willingness of investors to buy protective puts against a market decline of, say, 50%. They either assume a “What, me worry?” attitude or they pay a hefty premium for their “portfolio insurance,” especially if the market has had a few rough days.)
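The two-regime behavior described above can be sketched as a toy model. The threshold and overweighting factor here are hypothetical illustration values, not figures from Rebonato or the insurance studies:

```python
# Toy model of the two-regime response to small risks: below some
# mental threshold the risk is rounded down to zero; at or above it,
# the risk is overweighted. Threshold and inflation values are
# hypothetical, chosen only for illustration.

def perceived_probability(p, threshold=0.01, inflation=5.0):
    """Map a true probability p to a System I 'perceived' probability."""
    if p < threshold:
        return 0.0          # "What, me worry?" -- risk mentally set to zero
    return min(1.0, p * inflation)  # risk overweighted, capped at certainty

def willingness_to_pay(p, loss):
    """Premium implied by the perceived (not actual) probability of loss."""
    return perceived_probability(p) * loss

# A 0.5% chance of a 100,000 loss has a fair premium of 500, but the
# perceived probability is zero, so no insurance is bought at all.
print(willingness_to_pay(0.005, 100_000))   # 0.0

# A 2% chance of the same loss has a fair premium of 2,000, but the
# overweighted perception supports paying five times that.
print(willingness_to_pay(0.02, 100_000))    # 10000.0
```

Either way, the premium someone is willing to pay bears little relation to the actuarially fair one: it is discontinuous in the true probability, jumping from zero to a multiple of fair value as the risk crosses the mental threshold.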
The upshot is that the rules of thumb we’ve developed to get along in the world assume a world without black swans. Black swans addle the System I brain. And System II analytical methods are not only deprived of the use of System I heuristics; they “have to fight against our evolutionarily honed probabilistic rules of thumb, i.e., against voices whispered from deep inside our psyche.” (p. 39) No wonder we keep getting ourselves into such big trouble.