Risk Savvy Summary (9/10) — Unearned Wisdom
In Risk Savvy, Gigerenzer makes the contrarian case that the world doesn’t need more experts, more number crunching, or more big data, but rather a better education about the nature of risk. He contrasts risk with uncertainty. Risk is associated with controlled environments with fixed rules, such as casinos.
These are places that allow us to make precise probability calculations. Uncertainty, often mistaken for risk, is quite different. The stock market is not about risk, it’s about uncertainty. Since stock prices are determined by real world events, and unlike casinos, real world events don’t operate under a controlled environment with fixed rules, then we cannot apply our typical understanding of probabilities to the messy and highly complex world of the stock market.
That’s why stock market “experts” are no better than amateurs at picking stocks; often, they are even worse. Big life decisions such as whom to start a company with, whom to marry, and where to live are not about risk but about uncertainty. In these cases, Gigerenzer thinks we should depend on our intuition and simple rules of thumb. Intuition is not some magical ability that only some people have. All humans have intuition; it is based on unconscious learning and is therefore a highly intelligent mechanism for making decisions. Rules of thumb are also highly intelligent, since they consider only the most important variable rather than trying to optimize across many complicated ones.
Take, for instance, the act of catching a ball. If you wanted to become a reliably good catcher, it would be far more effective to simply fix the center of your visual field on the ball as it moves, and to keep it there by moving with it, than to mathematically calculate the ball’s trajectory and speed in order to determine where you and your hands need to be a few moments later.
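This “gaze heuristic” can even be sketched in code. Below is a toy simulation (the physics and numbers are made up for illustration, not from the book): the ball follows simple projectile motion, but the catcher never computes a trajectory. He only moves so that his angle of gaze up at the descending ball stays constant, and that alone delivers him to the landing spot.

```python
import math

g, vx, vy = 9.81, 12.0, 20.0        # gravity, horizontal/vertical launch speed
flight_time = 2 * vy / g            # when the ball returns to the ground
landing_x = vx * flight_time        # where it lands (~48.9 m)

def ball(t):
    """Ball's horizontal distance and height at time t (no air resistance)."""
    return vx * t, vy * t - 0.5 * g * t * t

# The catcher starts downfield of the landing point and begins tracking
# the ball on its descent. His only rule: move so the gaze angle up at
# the ball never changes. He never solves the trajectory.
t = 2.5
x_b, h = ball(t)
catcher = 60.0
tan0 = h / (catcher - x_b)          # fix the initial gaze angle

dt = 0.01
while t < flight_time:
    t = min(t + dt, flight_time)
    x_b, h = ball(t)
    catcher = x_b + h / tan0        # the position that keeps the angle constant

print(f"ball lands at {landing_x:.2f} m, catcher ends at {catcher:.2f} m")
```

As the ball’s height goes to zero, “keep the angle constant” forces the catcher’s position to converge on the landing point, which is exactly why the heuristic works without any calculation on the catcher’s part.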
Should you always go with shortcuts (or heuristics) and intuition rather than rely on mathematics? No. But people are often confused about when to use which. In casinos and in fixed environments, it is obviously better to go with the numbers. It would be silly to think that your intuition can help you beat a slot machine that is designed to pay out 80 percent of what’s put in. But it would also be silly to use mathematical equations to try to predict the movements of the stock market. The ironic thing is that many people do exactly that. They forget about the mathematics inside a casino, but when they are trying to play the stock market, they suddenly develop the urge to rely on Fibonacci sequences and mathematical instruments. This illustrates a basic confusion between risk and uncertainty.
Gigerenzer also explains how medical doctors fall prey to the same mistakes when calculating risk. For example, regular check ups are not effective at preventing disease, yet people are obsessed with getting them. Much more important is to avoid things that make you sick in the first place, like smoking, unhealthy food, and a sedentary lifestyle.
Traffic Accidents and 9/11
The increase in road travel had sobering consequences. Before the attack (of 9/11), the number of fatal traffic accidents remained close to the average of the previous five years (the zero line in Figure 1–2). Yet in each of the twelve months after 9/11, the number of fatal crashes was above average, and most of the time even higher than anything that happened in the previous five years. All in all, an estimated sixteen hundred Americans lost their lives on the road due to their decision to avoid the risk of flying. P.10
RISK: If risks are known, good decisions require logic and statistical thinking.
UNCERTAINTY: If some risks are unknown, good decisions also require intuition and smart rules of thumb.
Most of the time, a combination of both is needed. Some things can be calculated, others not, and what can be calculated is often only a crude estimate.
The Three Faces of Probability
One important fact is often overlooked. Probability is not one of a kind; it was born with three faces: frequency, physical design, and degrees of belief.
And these have persisted to this day.
Frequency: In the first of its identities, probability is about counting. Counting the number of days with rainfall or the number of hits a baseball player makes and dividing these by the total number of days or times at bat results in probabilities that are relative frequencies. Their historical origins lie in seventeenth-century mortality tables, from which life insurances calculated probabilities of death.
Physical Design. Second, probability is about constructing. For example, if a die is constructed to be perfectly symmetrical, then the probability of rolling a six is one in six. You don’t have to count. Similarly, mechanical slot machines are physically designed to pay out, say, 80 percent of what people throw in, and electronic machines have software that determines the probabilities. Probabilities by design are called propensities. Historically, games of chance were the prototype for propensity. These risks are known because people crafted, not counted, them.
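The difference between a designed probability and a counted one is easy to demonstrate. A minimal sketch (using Python’s `random` module to stand in for real die rolls): the propensity is known in advance from the die’s symmetry, while the frequency estimate only converges on it after many rolls.

```python
import random

random.seed(42)  # fixed seed so the run is reproducible

# Propensity: known from the die's physical design, no counting needed.
DESIGN_P = 1 / 6

# Frequency: estimate the same probability by rolling and counting.
rolls = [random.randint(1, 6) for _ in range(100_000)]
freq_p = rolls.count(6) / len(rolls)

print(f"by design: {DESIGN_P:.4f}, by counting: {freq_p:.4f}")
```

With a fair (symmetrical) die the two faces of probability agree in the long run; with a loaded one, only counting would reveal the truth.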
Degrees of Belief. Third, probability is about degrees of belief. A degree of belief can be based on anything from experience to personal impression. Historically, its origin is in eyewitness testimony in courts and, more spectacularly, in the Judeo-Christian lore of miracles. To this day, the testimony of two independent witnesses counts more than that of two who talked with each other beforehand, and the same holds for the testimony of a witness who did not know the defendant than that of his brother. But how to quantify these intuitions? That was the question that gave rise to degrees of belief expressed as probabilities. P 25
Uncertainty (stocks, romance, earthquakes, health, business) is confused with risk (casinos, slots, lotteries), which in turn is confused with certainty.
It is not always easy to know how uncertain the situation is in which we find ourselves, whether it contains known risks or is largely unpredictable. Let’s begin with a story by writer Nassim Taleb.24 Put yourself into the mind of a turkey. On the first day of your life a man came. You were afraid that he might kill you, but he was kind and gave you food. The next day, you see the man approach you once more. Will he feed you again? Using probability theory, you can calculate the chance that this will happen. The rule of succession, derived by the great mathematician Pierre-Simon Laplace, provides the answer:
Probability that something happens again if it happened n times before = (n+1)/(n+2)
Here, n is the number of days the farmer fed you. That is, after the first day, the probability that the farmer will feed you the next day is 2/3, after the second day it increases to 3/4, and so on, growing more and more certain every day. At the same time, the alternative that he might kill you becomes less and less likely. On day one hundred, it is almost certain the farmer will come to feed you, or so you might think. But unknown to you, that day is the day before Thanksgiving. Just when the probability of being fed is higher than ever before, you’re dead meat.
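Laplace’s rule is simple enough to compute directly. A small sketch of the turkey’s growing (and misplaced) confidence:

```python
def p_fed_again(n: int) -> float:
    """Laplace's rule of succession: after n feedings, the estimated
    probability of being fed again tomorrow is (n + 1) / (n + 2)."""
    return (n + 1) / (n + 2)

for day in (1, 2, 10, 100):
    print(f"after day {day:>3}: P(fed tomorrow) = {p_fed_again(day):.3f}")
# The probability creeps toward 1 with every feeding --
# and says nothing at all about Thanksgiving.
```

The formula is doing exactly what it should under known risks; the turkey’s mistake is applying it to an uncertain world where the rules can change without notice.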
Let’s sum up:
1. RISK ≠ UNCERTAINTY. The best decision under risk is not the best decision under uncertainty.
2. RULES OF THUMB ARE NOT DUMB. In an uncertain world, simple rules of thumb can lead to better decisions than fancy calculations.
3. LESS IS MORE. Complex problems do not always require complex solutions. Look for simple solutions first.
The Uncertain Future Is Hard to Predict
“Predictions are hard, especially about the future.” — Niels Bohr (also attributed to Mark Twain, Yogi Berra, and a host of others)
In 1876 Western Union, the largest American telegraph company, refused to buy Graham Bell’s patent for one hundred thousand dollars, arguing that people are not savvy enough to handle a phone: “Bell expects that the public will use his instrument without the aid of trained operators. Any telegraph engineer will at once see the fallacy of this plan. The public simply cannot be trusted to handle technical communications equipment.”29 A group of British experts thought somewhat differently: “The telephone may be appropriate for our American cousins, but not here, because we have an adequate supply of messenger boys.”
A few years later, a committee of the British Parliament evaluated Thomas Edison’s lightbulb and concluded that it would be “good enough for our trans-Atlantic friends…but unworthy of the attention of practical or scientific men.”
“Radio has no future.” Attributed to Lord Kelvin, former president of the Royal Society, ca. 1897.
“Rail travel at high speed is not possible because passengers, unable to breathe, would die of asphyxia [suffocation].” Dr. Dionysius Lardner (1793–1859), professor at University College London, and author of a book on the steam engine, was one of several doctors who prophesied that the rapid movement of trains would cause death or brain trouble among travelers and vertigo among onlookers.
Car pioneer Gottlieb Daimler (1834–1900) believed that there would never be more than one million cars worldwide because of the lack of available drivers. Daimler based this prediction on the false assumption that cars would have to be operated by chauffeurs.
Howard Aiken, who constructed the Mark I computer for IBM in 1943, reminisced: “Originally one thought that if there were a half dozen large computers in this country, hidden away in research laboratories, this would take care of all requirements we had throughout the country.” This prediction was based on the false assumption that computers would solve scientific problems only.
As we have seen, experiencing a visual illusion means making a good error. Good errors are errors that need to be made. Children are known for these. Consider a three-year-old who uses the phrase “I gived” instead of “I gave.” The child cannot know in advance which verbs are regular and which are irregular. Because irregular verbs are rare, the child’s best bet is to assume the regular form until proven wrong. Such errors are good, or functional, because if the child decided to play it safe and use only those verbs he has already heard, he would learn at a much slower pace. Learn by failing, or you fail to learn.
Serendipity, the discovery of something one did not intend to discover, is often a product of error. Christopher Columbus wanted to find a sea route to India. He believed he could reach India by ship because he made an error: He grossly underestimated the diameter of the globe. Others knew better and criticized his plan as foolish. They were right. But because of his error, Columbus discovered something else, America. Similarly, some of my own discoveries were never planned, such as the discovery of the “less-is-more effect.” Here is the story.
For an experiment, we needed a set of easy questions and a set of hard ones. Because those who took part in the experiment were German, we came up with questions about the population of German cities (which we assumed would be easy) and U.S. cities (hard). We chose the seventy-five largest cities in each country. For instance,
“Which city has a larger population: Detroit or Milwaukee?”
“Which city has a larger population: Bielefeld or Hanover?”
The result blew our minds. Germans didn’t do better on questions about German cities, about which they knew a lot; they did slightly better on American cities, about which they knew little. We’d made an error in assuming that knowing more always leads to better inferences. The experiment was ruined.
But this error led us to discover something new, which we called the recognition heuristic:
If you recognize the name of one city but not that of the other, then infer that the recognized city has the larger population. Many Germans had never heard of Milwaukee, and so they correctly concluded that Detroit has the larger population. Because they were familiar with both Bielefeld and Hanover, however, the rule of thumb didn’t work for this question. An American who has never heard of Bielefeld will correctly infer that Hanover has more inhabitants, but Germans have a hard time.
Similarly, in another study, only 60 percent of Americans correctly answered that Detroit is larger than Milwaukee, while some 90 percent of Germans got it right. The recognition heuristic takes advantage of the wisdom in semi-ignorance. This simple rule doesn’t work all the time, only when bigger objects are indeed more widely recognized.
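The recognition heuristic itself fits in a few lines. A minimal sketch; the city names and the “recognized” set below are illustrative stand-ins, not data from the actual study:

```python
# Cities a hypothetical German participant recognizes (Milwaukee is absent).
GERMAN_RECOGNIZES = {"Detroit", "Bielefeld", "Hanover"}

def recognition_heuristic(city_a, city_b, recognized):
    """If exactly one of the two cities is recognized, infer that the
    recognized one has the larger population; otherwise don't answer."""
    a, b = city_a in recognized, city_b in recognized
    if a and not b:
        return city_a
    if b and not a:
        return city_b
    return None  # recognizes both or neither: the heuristic doesn't apply

print(recognition_heuristic("Detroit", "Milwaukee", GERMAN_RECOGNIZES))  # -> Detroit
print(recognition_heuristic("Bielefeld", "Hanover", GERMAN_RECOGNIZES))  # -> None
```

The code makes the “wisdom in semi-ignorance” concrete: the rule only fires when ignorance supplies the signal, which is why partial knowledge can beat full knowledge on this kind of question.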
Good errors help us to learn and to discover. A system that makes no errors will learn little and discover even less.
Educators often think of building young minds that ideally make no errors.
This view is an example of a bad error. Intelligence, creativity, and innovation will cease if people are prohibited from making errors. That does not mean that every error is good. The spread of AIDS in Africa was dramatically underestimated by the World Health Organization (WHO), whose computer models assumed that the probability of infection increased with the number of sexual contacts, independent of the number of sexual partners. But ten contacts with one partner lead to a much lower chance of infection than one contact with ten different partners.
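The arithmetic behind this claim is worth seeing. A sketch with made-up illustrative numbers (q for the share of infected partners in the population, p for the transmission chance per contact with an infected partner, neither from the book): concentrating contacts on one partner caps the risk at that partner’s chance of being infected, while spreading them over many partners multiplies independent chances.

```python
# Hypothetical parameters, chosen only to illustrate the WHO model's flaw.
q, p = 0.10, 0.10   # prevalence; per-contact transmission probability

# Ten contacts with ONE partner: you are only at risk at all
# if that single partner is infected.
one_partner = q * (1 - (1 - p) ** 10)

# One contact each with TEN partners: ten independent chances of
# drawing an infected partner.
ten_partners = 1 - (1 - q * p) ** 10

print(f"one partner, 10 contacts: {one_partner:.3f}")   # -> 0.065
print(f"ten partners, 1 contact:  {ten_partners:.3f}")  # -> 0.096
```

Same total number of contacts, materially different risk, which is exactly the dependence the WHO models ignored when they conditioned on contacts alone.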
Steady sources of bad errors are the zero-risk illusion and the turkey illusion. For instance, banks continue to use models such as value-at-risk that assume that all risks are known and can be estimated precisely, even though this illusion of certainty contributed to the financial crisis.
Blunders like these are not only embarrassing in hindsight but can be disastrous. Bad errors are errors that are not functional and should be avoided, in everyone’s best interest.
Ask whether checklists are used; if the answer is no or not forthcoming, choose a different hospital.
Digital communication technology, from the Internet to Facebook to digital eyewear to technologies we cannot yet imagine, deeply influences what we spend our time on, what privacy means, and how we think. The question is not whether the digital media will change our mental lives; they do. The question is how. Digital technology provides huge opportunities and is not the problem. The problem lies in us, whether we remain at the helm or are instead remote-controlled by new technology. Digital media have already changed the way people conduct their social relations and the risks they are willing to take. In an interview, three Connecticut high school students explained why they send text messages while driving.
Roman says he is not going to stop: “I know I should, but it’s not going to happen. If I get a Facebook message or something posted on my wall . . . I have to see it. I have to.” Similarly, Maury does not give reasons but expresses a need to connect: “I interrupt a call even if the new call says ‘unknown’ as an identifier. I just have to know who it is. So I’ll cut off a friend for an ‘unknown.’ I need to know who wanted to connect. . . . And if I hear my phone, I have to answer it. I don’t have a choice. I have to know who it is, what they are calling for.” Marilyn adds: “I keep the sound on when I drive. When a text comes in, I have to look. No matter what. Fortunately, my phone shows me the text as a pop up right up front . . . so I don’t have to do too much looking while I’m driving.”
These three students are willing to risk a car accident in order to gratify their need to connect digitally. When asked when was the last time they didn’t want to be interrupted, there was silence. “I am waiting to be interrupted right now,” one said. Interruption has become connection. Even in the physical company of real friends, there is a strong urge to be contacted by someone else online. Digital technology has taken control over these young people’s risk taking and social relations. It has also changed some parents’ relations to their children.
Because digital technology enables constant monitoring, parents often monitor; the result is higher parental anxiety. As one mother agonized: “I’ve sent a text. Nothing back. And I know they have their phones. Intellectually, I know there is little reason to worry. But there is something about this unanswered text.” The same mother envied her own mother, who didn’t worry back then. Children went to school, came home. Her mother worked and returned around six. Today some children have a fantasy of their parents simply waiting for them, expectantly, without having called them twice on the way home. “I’d like to make a call” has changed to “I need to make a call.” The ability to be alone and to reflect on one’s emotions in private runs counter to the spirit of digital networking; teenagers confess to discomfort when they are without their cell phones.
A study reported that two thirds of Britons concentrate so hard on their mobile phone when texting that they lose peripheral vision.14 According to legend, after pedestrians started walking into lampposts, some cities padded the posts. Are teens hooked on digital media at least happy?
A study with over one hundred fourteen- to seventeen-year-olds with excessive Internet use reported that only 10 percent were very happy with their leisure time (compared to 39 percent in a control group of peers), 13 percent with their friends (versus 49 percent), 3 percent with themselves (26 percent), and only 2 percent with their life in general (29 percent).15 These teens had virtually stopped reading and completely stopped going to events and being engaged with society. Digital risk competence is the ability to harvest the benefits of digital technology while avoiding harm. It has a cognitive and a motivational component: risk literacy and self-control.
Originally published at https://unearnedwisdom.com on March 21, 2022.