
Monday, June 10, 2019

Reading Room Material: Misbehaving

Your Tank is Empty

You glance down at your gas gauge. You're coasting on fumes. Time to fill 'er up! There are two gas stations across the street from each other. Which gas station would you choose? 

Gas Station A: The price for gas is $2.45 if you pay with cash, and a $0.05 surcharge for using a credit card.

Gas Station B: The price for gas is $2.50 if you pay with a credit card, and there is a $0.05 discount for paying with cash. 

What is "Behavioral Economics?"

It may be apocryphal, but Hitler is famously credited with asking, "Guns or butter?" [1] At first glance, the question seems like a false dichotomy. But if you dig a little deeper, you find that the common denominator is nitrogen. You can either turn nitrates into gunpowder or you can convert them into fertilizer [2].

What does "guns or butter" have to do with Economics? Simply put, Economics is the scientific examination of making decisions when resources are scarce. In a world of finite resources, you have to pick. Do you want guns and tanks, or do you want plows and tractors? You can't have both, so you have to choose.

Where, then, does the "behavioral" aspect of behavioral economics come into play? I'm glad you asked because Dr. Richard Thaler, author of the book Misbehaving, has a terrific answer [3].


What's the Big Idea? 

Before delving into Dr. Thaler's big idea, a little context is necessary. Every scientific theory is based on a few core assumptions. Economics is no different. Neoclassical Economics assumes that people are rational and are able to make optimal decisions to further their own agenda. In other words, I know what makes my life worth living. It might be different from yours, but we both are able to decide what's best for ourselves. In economic jargon, we are experts at maximizing our own utility (read: "happiness"). For simplicity, Thaler refers to these rational types of people as Econs.

Assuming people are rational, various factors are deemed irrelevant when making decisions. In the scenario that opened this post, both options should be equally appealing to an Econ. If we observed Econs filling up their cars, about half of them would choose Station A and the other half would choose Station B. 

But what do we observe when we watch Humans purchase gas? Which gas station would you choose? Gas stations that add a surcharge for credit cards quickly learn that Humans overwhelmingly prefer discounts.
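Framing aside, the two stations charge identical prices. A quick calculation (working in integer cents to keep the arithmetic exact) confirms why an Econ would be indifferent:

```python
# Prices from the scenario above, in cents.
a_cash = 245          # Station A: cash price
a_card = 245 + 5      # Station A: credit card surcharge
b_card = 250          # Station B: credit card price
b_cash = 250 - 5      # Station B: cash discount

# An Econ only cares about the final price per gallon.
assert a_cash == b_cash == 245  # cash costs the same at both stations
assert a_card == b_card == 250  # so does credit
```

Only the framing differs: Station A describes the card price as a penalty, while Station B describes the cash price as a reward.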

Thaler's book is chock-full of big ideas. Probably the biggest among them is that economic theory rests on the faulty assumption that we are Econs instead of Humans. Unfortunately for Neoclassical Economics, Humans are swayed by supposedly irrelevant factors, or "SIFs" for short.


The Big SIF: The Sunk Cost Fallacy

What's an example of a SIF? Personally, one of the hardest lessons for me to learn was to ignore "sunk costs." I bet this has happened to you, so I'm sure you can sympathize. A couple of years ago, I bought a ticket to a brew festival. The lineup of craft brewers was incredible, and I was excited to try some new libations. But on the day of the beer fest, I came down with a fever. I tried to coax myself to go. I couldn't let all that money go to waste! But then a lesson from college came back to haunt me. That money is gone. I can't jeopardize my health because that would be "throwing good money after bad." In the end, I had to stay home (and ignore the sunk cost of my ticket!). 

The sunk cost fallacy is just one example of the many SIFs that Dr. Thaler describes and had a hand in discovering. By the end of the book, you are left wondering how economists can continue to believe in the rational individual. More importantly, how must economic theory change to accommodate these SIFs? I guess it's up to us misbehaving Humans to figure it out. 


Share and Enjoy!

Dr. Bob

More Material

[1] According to the all-knowing Wikipedia, it looks like Hitler didn't coin the term, but the Nazis did use the concept of "guns or butter" in their propaganda. 

[2] Making decisions on a stingy planet also calls to mind our discussion on Opportunity Costs.

[3] Thaler, R. H. (2015). Misbehaving: The making of behavioral economics. W. W. Norton & Company.

Thursday, December 21, 2017

Joined At the Hip: The Conjunction Fallacy

Learning By Doing

Consider the following scenario: 

Linda is 31 years old, single, outspoken and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in anti-nuclear demonstrations.
  1. Linda is active in the feminist movement.
  2. Linda is a bank teller.
  3. Linda is a bank teller and is active in the feminist movement.
Your task is to order the three descriptions from most likely to least likely. In other words, is Linda more likely to be a feminist than a bank teller (or vice versa)? Once you have rank-ordered the descriptions, feel free to read on.


Scientists on a Plane

After attending a conference, I was on a flight home when a woman with a poster tube sat down next to me (I had the middle seat...lucky me). Curious, I asked whether she was a scientist or an architect. Her reply took me by surprise. "I'm a scientist. Don't I look like one?" I feebly tried to explain that my guess wasn't based on looks. Instead, I was trying to use information about the base rates of scientists in the US. I assumed there are more scientists than architects [1]. 

I think her question highlights a very common way of thinking. It is extremely easy to ignore the frequency of a particular class or the probability of the occurrence of a specific event. In other words, it is easy to forget about base rates (e.g., the number of female social scientists employed in the US) when there is a concrete example immediately in front of you.

Let's return to the question that opened this post. I haven't taken the time to investigate how many feminists there are in the United States, nor have I tried to mine the US Census data to figure out the number of employed bank tellers. But I know for certain that there are fewer feminist bank tellers than there are bank tellers or feminists. How do I know that? Because it is logically impossible to have more feminist bank tellers than members of either group alone. The following Venn diagram proves this logical necessity (see Fig. 1).


Figure 1. A Venn diagram depicting feminist bank tellers.
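The Venn-diagram logic holds for any two sets: the overlap can never be larger than either set on its own. Here is a minimal sketch; the population and the trait frequencies are invented purely for illustration:

```python
import random

random.seed(42)

# Invented population: each person is a (is_teller, is_feminist) pair,
# drawn with made-up base rates of 2% and 30% respectively.
people = [(random.random() < 0.02, random.random() < 0.30)
          for _ in range(100_000)]

tellers   = sum(1 for t, f in people if t)
feminists = sum(1 for t, f in people if f)
both      = sum(1 for t, f in people if t and f)

# The conjunction can never exceed either parent group.
assert both <= tellers and both <= feminists
```

However you tweak the base rates, the final assertion can never fail: every feminist bank teller is counted in both parent groups.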


"We are connected/Siamese twins/At the wrist" –Smashing Pumpkins, Geek U.S.A.

Forgetting that overlapping populations are less frequent than their parent populations goes by the name Conjunction Fallacy. This mental slip was first described by Amos Tversky and Daniel Kahneman in the early '80s. They noticed that they themselves would initially make this error; however, after they had sufficient time to think about it, they realized their mistake. This dynamic duo decided that their own logical slip was probably symptomatic of the larger population (I mean...they were some pretty smart dudes. If they admit to falling into the conjunction fallacy, then none of us are safe). When they tested their hypothesis in a wider group, they found strong evidence that others make the same mistake [2].


The S.T.E.M. Connection

The educational implications of the conjunction fallacy probably belong in the category of "critical thinking." We need to teach our students not to fall into the trap of thinking that a subpopulation will be more frequent than the pool from which it is drawn. For example, the description of Linda fits closely with the stereotype that we might have about feminists. But it turns out that we should ignore that stereotype when it conflicts with the parameters of a larger population. 

If you are teaching a class on probability, the "Linda Problem" is a good way to introduce the topic of conjunctive probabilities. After the students complete the task, you can move into a lesson explaining the mathematics behind joint probabilities. In other words, the lesson can focus on the importance of the word "and." As a heuristic, you can explain that each time you add the word "and," the probabilities are multiplied together; and when decimals are multiplied together, the overall probability decreases. As a wrap-up to the lesson, you can circle back to the Linda Problem and discuss how the mathematics applies to this scenario. Once we say Linda is a bank teller and a feminist, the proportion of people who fit that description drops precipitously. 
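The classroom heuristic can be shown in a few lines. Treating the two traits as independent (a simplification for the lesson), each "and" multiplies the probabilities, and multiplying numbers below 1 can only shrink the result. The two probabilities below are invented for illustration:

```python
p_teller   = 0.05  # hypothetical: 5% chance Linda is a bank teller
p_feminist = 0.80  # hypothetical: 80% chance Linda is a feminist

# "bank teller AND feminist", assuming the traits are independent:
p_both = p_teller * p_feminist  # roughly 0.04

# The conjunction is smaller than either probability alone.
assert p_both <= p_teller and p_both <= p_feminist
```

Even when one trait is very likely (0.80 here), adding it to the description still shrinks the joint probability below the rarer trait on its own.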

Estimating the likelihood of an event or the frequency of a class (or subclass) is extremely important in assessing risk. It is used in financial estimation, medical reasoning, and in psychological experiments when defining a target population. Falling prey to the conjunction fallacy is easy, but if we remember to slow down our thinking, we might be able to detect (and escape!) our error. 


Share and Enjoy!

Dr. Bob

Going Beyond the Information Given

[1] To verify my assumption, I tried to look through the US Census occupation data. It was more difficult than I initially thought because the Bureau of Labor Statistics lumps together architects and engineers. However, if we restrict our analyses strictly to architects (non-naval) and physical scientists (all other), then my assumption was correct (246,000 architects and 261,000 physical scientists). 

[2] Tversky, A., & Kahneman, D. (1983). Extensional versus intuitive reasoning: The conjunction fallacy in probability judgment. Psychological Review, 90(4), 293.

Thursday, May 18, 2017

Can You Either Confirm or Deny?: Confirmation Bias

Learning By Doing


Let's play a game. Unfortunately, I have to send you away from this page. But go play and come right back! 

Reflection Questions
  • So...how did you do? 
  • Did you figure out the rule that governs the sequence of numbers? 
  • What problem-solving strategies did you use? 
  • Was there something that you wish you would have done differently? 


The Two Flavors of Confirmation Bias

Informally, the confirmation bias is the tendency to seek evidence that is consistent with your beliefs. The more personal the beliefs, the stronger the bias. More formally, there are two parts to the definition. The first part is "searching for confirmatory evidence," and the second part is "selectively interpreting the data to fit with one's hypothesis."


Selective Search of Data: The Luminiferous Ether

The number generation game that you played at the beginning of this post is a good example of looking for evidence that conforms to your initial hypothesis [1]. It's a tricky puzzle, and an overwhelming majority of people submit triples that confirm their suspicions. If this describes you, then you are not alone.
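In Wason's original 2-4-6 task [1], the hidden rule is simply "any three increasing numbers." A small sketch (function names are mine) makes it clear why confirmatory triples like (8, 10, 12) can never falsify a too-narrow hypothesis such as "the numbers increase by two":

```python
def hidden_rule(a, b, c):
    """Wason's actual rule: the three numbers strictly increase."""
    return a < b < c

def my_hypothesis(a, b, c):
    """A typical (too narrow) guess: each number increases by two."""
    return b - a == 2 and c - b == 2

# Confirmatory tests: both rules say "yes," so we learn nothing.
for triple in [(8, 10, 12), (1, 3, 5), (20, 22, 24)]:
    assert hidden_rule(*triple) and my_hypothesis(*triple)

# A disconfirming test: only this kind of probe exposes the mistake.
assert hidden_rule(1, 2, 3) and not my_hypothesis(1, 2, 3)
```

The only way out of the trap is to propose triples your own hypothesis predicts should fail; if the experimenter says "yes" anyway, your hypothesis is wrong.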

Confirmation bias is not relegated to the psychological laboratory. It also operates in the real world. Scientists, for example, often have a vested and personal interest in seeing their hypotheses confirmed by their data. A classic example in the history of science is the search for evidence of the "luminiferous ether." Until the late 19th century, this was believed to be the substance that carried light. Like sound, light was thought to need a medium through which to propagate. Finally, in 1887, Albert Michelson and Edward Morley conducted a famous experiment that failed to find any evidence of the ether [2]. Before that experiment, a great deal of effort was invested in finding evidence for this mysterious substance.

Bottom line: The data are selectively collected and disconfirmatory evidence is deemed irrelevant.


Selective Interpretation of Data: The People v. O. J. Simpson 

The O. J. Simpson trial is a good example of selectively interpreting evidence to support your position or claim [3]. As in most trials, there was evidence that nobody could deny: blood at O. J.'s house contained the DNA of Nicole Brown Simpson. There was blood found in O. J.'s white Ford Bronco that matched both Nicole and Ron Goldman's DNA. O. J. Simpson had been arrested for physically assaulting Nicole. These are all incontrovertible facts. However, the defense and the prosecution interpreted the data differently. The defense claimed that the blood samples were planted by a racist LAPD cop. The prosecution countered that the blood was not planted, but was a result of the murders and a subsequent coverup by O. J.

Bottom line: The data are right, but the interpretation of the data are subject to dispute.

The S.T.E.M. Connection

There are implications of the confirmation bias for the classroom as well. In the mid-to-late 1960s, educational psychologists experimentally manipulated teachers' expectations of their students. Teachers were told that certain students were about to experience a learning "spurt." In reality, the researchers had randomly selected which kids were assigned to the "spurt" condition. 

What did they find? Teacher expectations had a measurable impact on the number of IQ points the students gained over the course of an academic year. The effect was particularly strong for kids in first and second grade [4]. Although the authors did not propose a mechanism, we might suspect that the confirmation bias was at work. Every time a child in the spurt condition did something notable, it confirmed the teacher's expectation. And if a student failed to live up to that expectation, you might imagine the teacher explaining away the behavior (e.g., she was just having a bad day).

Confirmation bias plagues us all, and it can be difficult to avoid. Given that, it is important to experience it firsthand, receive feedback when it happens, and practice looking for and interpreting evidence that goes against one's beliefs. Only then can we get a true picture of the world.  


Share and Enjoy!

Dr. Bob

Going Beyond the Information Given

[1] Wason, P. C. (1960). On the failure to eliminate hypotheses in a conceptual task. Quarterly Journal of Experimental Psychology, 12(3), 129-140.

[2] Motta, L. (2007) Michelson-Morley experiment. Retrieved from http://scienceworld.wolfram.com/physics/Michelson-MorleyExperiment.html

[3] Nickerson, R. S. (1998). Confirmation bias: A ubiquitous phenomenon in many guises. Review of General Psychology, 2(2), 175.

[4] Rosenthal, R., & Jacobson, L. (1968). Pygmalion in the classroom. New York: Holt, Rinehart and Winston.