One of my aunts, more comfortable with results than principles, used to bring her granddaughter to the family Easter egg hunts. Over and over, as she would spot eggs that her granddaughter hadn’t seen, she would intone, “Karen Lee, open up your big brown eyes!” She thought, as most people do, that you can see what’s there simply by opening up your eyes and looking.
Magicians consistently disprove this notion. Harvard psychologists quantified it. In a well-known study, they showed subjects a video of basketball players passing the ball and asked them to count the number of passes made by either the white-uniformed or the black-uniformed team. In the middle of the video one of two strange things happened, each lasting about five seconds: either a woman with an umbrella or a person in a gorilla costume walked through the center of the action. Thirty-five percent of the subjects failed to notice the woman, and 56 percent missed the gorilla, though both were obvious to anyone not engaged in the counting task.
Those who had been given no instructions but merely watched the video came to their task with “an attitude open to an unfamiliar world, accepting of whatever was there. There was no model and there were no expectations. The order ‘Tell me what you see’ produces curiosity. The order ‘Count the passes’ produces a closed system, a narrowing of attention directed at a particular task, which fills up working memory. The implicit assumption is that you know what you’re doing and know what sort of perceptual input you want. … Such a closed attitude can prevent new perceptions from being incorporated into the model. Such a closed attitude can kill you.” (p. 77) The positive version: “Some people update their models better than others. They’re called survivors.” (p. 79)
Count the bars, recognize the pattern, and—oops—miss that gorilla in the room. Or broaden your focus so that you at least see the gorilla, and then decide whether or not it’s relevant.
The final takeaway for this two-part post on Deep Survival comes from the theory of risk homeostasis, which says that people accept a given level of risk. “While it’s different for each person, you tend to keep the risk you’re willing to take at about the same level. If you perceive conditions as less risky, you’ll take more risk. If conditions seem more risky, you’ll take less risk.” (p. 112) For instance, when antilock brakes were introduced, the expectation was that the accident rate would go down. In fact, it went up because people figured that driving was safer with the new brakes, so they drove more aggressively.
People tend to normalize risk, which means, in part, that “if you’ve tallied a lot of experience in dangerous, iffy environments without significant calamity, the mental path of least resistance is to assume it was your skill and savvy that told the tale.” Ah, but every new experience is different. “… even if you are intimately familiar with [the mountain’s or a trading instrument’s] subtleties of character, it can make a mockery of the most thoughtful plans. Experience is nothing more than the engine that drives adaptation, so it’s always important to ask: Adaptation to what? You need to know if your particular experience has produced the sort of adaptation that will contribute to survival in the particular environment you choose. And when the environment changes, you have to be aware that your own experience might be inappropriate.” (p. 113)