
Monday, June 10, 2019

Reading Room Material: Misbehaving

Your Tank is Empty

You glance down at your gas gauge. You're coasting on fumes. Time to fill 'er up! There are two gas stations across the street from each other. Which gas station would you choose? 

Gas Station A: The price for gas is $2.45 if you pay with cash, with a $0.05 surcharge if you use a credit card.

Gas Station B: The price for gas is $2.50 if you use a credit card, with a $0.05 discount if you pay with cash. 

What is "Behavioral Economics?"

I'm not sure whether this is apocryphal, but I believe Hitler famously asked, "Guns or butter?" [1] At first glance, the question seems like a false dichotomy. But if you dig a little deeper, you find that the common denominator is nitrogen. You can either turn nitrates into gunpowder or you can convert them into fertilizer [2].

What does "guns and butter" have to do with Economics? Simply put, Economics is the scientific examination of making decisions when resources are scarce. In a world of finite resources, you have to pick. Do you want guns and tanks, or do you want plows and tractors? You can't have both, so you have to choose.

Where, then, does the "behavioral" aspect of behavioral economics come into play? I'm glad you asked because Dr. Richard Thaler, author of the book Misbehaving, has a terrific answer [3].


What's the Big Idea? 

Before delving into Dr. Thaler's big idea, a little context is necessary. Every scientific theory is based on a few core assumptions. Economics is no different. Neoclassical Economics assumes that people are rational and are able to make optimal decisions to further their own agenda. In other words, I know what makes my life worth living. It might be different from yours, but we both are able to decide what's best for ourselves. In economic jargon, we are experts at maximizing our own utility (read: "happiness"). For simplicity, Thaler refers to these rational types of people as Econs.

Assuming people are rational, various factors are deemed irrelevant when making decisions. In the scenario that opened this post, both options should be equally appealing to an Econ. If we observed Econs filling up their cars, about half of them would choose Station A and the other half would choose Station B. 
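If you want to check the arithmetic, here's a quick sketch (using the prices from the scenario above) showing that the two framings produce exactly the same out-of-pocket cost:

```python
# Effective price per gallon under each framing (prices from the scenario above).
CASH, CREDIT = "cash", "credit"

def station_a(payment):
    """$2.45 base price, plus a $0.05 surcharge for credit."""
    return 2.45 + (0.05 if payment == CREDIT else 0.00)

def station_b(payment):
    """$2.50 base price, minus a $0.05 discount for cash."""
    return 2.50 - (0.05 if payment == CASH else 0.00)

for payment in (CASH, CREDIT):
    print(f"{payment:>6}: Station A = ${station_a(payment):.2f}, Station B = ${station_b(payment):.2f}")
# Same numbers either way -- only the framing (surcharge vs. discount) differs.
```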

But what do we observe when we watch Humans purchase gas? Which gas station would you purchase gas from? Gas stations that added a surcharge for credit cards quickly learned that Humans overwhelmingly prefer discounts.

Thaler's book is chock-full of big ideas. Probably the biggest among them is the idea that economic theory is based on the faulty assumption that we are Econs instead of Humans. Unfortunately for Neoclassical Economics, Humans are swayed by supposedly irrelevant factors, or "SIFs" for short.


The Big SIF: The Sunk Cost Fallacy

What's an example of a SIF? Personally, one of the hardest lessons for me to learn was to ignore "sunk costs." I bet something like this has happened to you, so I'm sure you can sympathize. A couple of years ago, I bought a ticket to a brew festival. The lineup of craft brewers was incredible, and I was excited to try some new libations. But on the day of the beer fest, I came down with a fever. I tried to coax myself to go. I couldn't let all that money go to waste! But then a lesson from college came back to haunt me. That money is gone. I can't jeopardize my health because that would be "throwing good money after bad." In the end, I had to stay home (and ignore the sunk cost of my ticket!). 

The sunk cost fallacy is just one example of many SIFs that Dr. Thaler describes and had a hand in discovering. By the end of the book, you are left wondering how economists can continue to believe in a purely rational individual. More importantly, how must economic theory change to accommodate these SIFs? I guess it's up to us misbehaving Humans to figure it out. 


Share and Enjoy!

Dr. Bob

More Material

[1] According to the all-knowing Wikipedia, it looks like Hitler didn't coin the term, but the Nazis did use the concept of "guns or butter" in their propaganda. 

[2] Making decisions on a stingy planet also calls to mind our discussion on Opportunity Costs.

[3] Thaler, R. H. (2015). Misbehaving: The making of behavioral economics. W. W. Norton & Company.

Thursday, December 21, 2017

Joined At the Hip: The Conjunction Fallacy

Learning By Doing

Consider the following scenario: 

Linda is 31 years old, single, outspoken and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in anti-nuclear demonstrations.
  1. Linda is active in the feminist movement.
  2. Linda is a bank teller.
  3. Linda is a bank teller and is active in the feminist movement.
Your task is to order the three descriptions from most likely to least likely. In other words, is Linda more likely to be a feminist than a bank teller (or vice versa)? Once you have rank-ordered the descriptions, feel free to read on.


Scientists on a Plane

After attending a conference, I was on a flight home when a woman with a poster tube sat down next to me (I had the middle seat...lucky me). Curious, I asked whether she was a scientist or an architect. Her reply took me by surprise. "I'm a scientist. Don't I look like one?" I feebly tried to explain that my guess wasn't based on looks. Instead, I was trying to use information about the base rates of scientists in the US. I assumed there are more scientists than architects [1]. 

I think her question highlights a very common way of thinking. It is extremely easy to ignore the frequency of a particular class or the probability of the occurrence of a specific event. In other words, it is easy to forget about base rates (e.g., the number of female social scientists employed in the US) when there is a concrete example immediately in front of you.

Let's return to the question that opened this post. I haven't taken the time to investigate how many feminists there are in the United States, nor have I tried to mine the US Census data to figure out the number of employed bank tellers. But I know for certain that there are fewer feminist bank tellers than there are bank tellers, and fewer than there are feminists. How do I know that? Because it is logically impossible for the overlap to be larger than either parent group. The following Venn diagram proves this logical necessity (see Fig. 1).


Figure 1. A Venn diagram depicting feminist bank tellers.
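If you prefer code to pictures, here is a tiny sketch of the same containment argument (the names and group memberships are invented purely for illustration): the intersection of two sets can never have more members than either set on its own.

```python
# Made-up names, purely for illustration.
bank_tellers = {"Linda", "Tom", "Priya"}
feminists = {"Linda", "Priya", "Maria", "Chen"}

feminist_bank_tellers = bank_tellers & feminists  # the overlap: {"Linda", "Priya"}

# The overlap can never be larger than either parent group.
assert len(feminist_bank_tellers) <= len(bank_tellers)
assert len(feminist_bank_tellers) <= len(feminists)
print(feminist_bank_tellers)
```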


"We are connected/Siamese twins/At the wrist" –Smashing Pumpkins, Geek U.S.A.

Forgetting that an overlapping population is less frequent than its parent populations goes by the name of the conjunction fallacy. This mental slip was first described by Amos Tversky and Daniel Kahneman in the early '80s. They noticed that they themselves would initially make this error; however, after they had sufficient time to think about it, they realized their mistake. This dynamic duo decided that their own logical slip was probably symptomatic of the larger population (I mean...they were some pretty smart dudes. If they admit to falling for the conjunction fallacy, then none of us are safe). When they tested their hypothesis in a wider group, they found strong evidence that others also make the same mistake [2].


The S.T.E.M. Connection

The educational implications of the conjunction fallacy probably belong in the category of "critical thinking." We need to teach our students not to fall into the trap of thinking that a subpopulation can be more frequent than the larger pool from which it is drawn. For example, the description of Linda fits closely with the stereotype we might have about feminists. But we should not let that resemblance override the logic of the larger population: the conjunction can never be more likely than either of its parts. 

If you are teaching a class on probability, the "Linda Problem" is a good way to introduce the topic of conjunctive probabilities. After the students complete the task, you can then go into a lesson explaining the mathematics behind joint probabilities. In other words, the lesson can focus on the importance of the word "and." As a heuristic, you can explain that each time you add the word "and," the probabilities are multiplied together; and because probabilities are numbers between 0 and 1, multiplying them can only make the result smaller. As a wrap-up to the lesson, you can circle back to the Linda Problem and have a discussion about how the mathematics applies to this scenario. Once we say Linda is a bank teller and a feminist, the proportion of people who fit that description drops precipitously. 
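To put rough numbers on it, here is a minimal sketch of the "and" heuristic. The probabilities are invented just for illustration; the point is that the conjunction can never be more likely than its least likely part:

```python
# Hypothetical probabilities, invented purely for illustration.
p_teller = 0.02                 # P(Linda is a bank teller)
p_feminist_given_teller = 0.30  # P(Linda is a feminist, given she is a bank teller)

# The word "and" multiplies: P(teller AND feminist) = P(teller) * P(feminist | teller)
p_both = p_teller * p_feminist_given_teller

print(f"P(bank teller)              = {p_teller:.3f}")
print(f"P(bank teller AND feminist) = {p_both:.3f}")
assert p_both <= p_teller  # the conjunction can never exceed either component
```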

Estimating the likelihood of an event or the frequency of a class (or subclass) is extremely important in assessing risk. It is used in financial estimation, medical reasoning, and in psychological experiments when defining a target population. Falling prey to the conjunction fallacy is easy, but if we remember to slow down our thinking, we might be able to detect (and escape!) our error. 


Share and Enjoy!

Dr. Bob

Going Beyond the Information Given

[1] To verify my assumption, I tried to look through the US Census occupation data. It was more difficult than I initially thought because the Bureau of Labor Statistics lumps together architects and engineers. However, if we restrict our analyses strictly to architects (non-naval) and physical scientists (all other), then my assumption was correct (246,000 architects and 261,000 physical scientists). 

[2] Tversky, A., & Kahneman, D. (1983). Extensional versus intuitive reasoning: The conjunction fallacy in probability judgment. Psychological Review, 90(4), 293.

Thursday, May 18, 2017

Can You Either Confirm or Deny?: Confirmation Bias

Learning By Doing


Let's play a game. Unfortunately, I have to send you away from this page. But go play and come right back! 

Reflection Questions
  • So...how did you do? 
  • Did you figure out the rule that governs the sequence of numbers? 
  • What problem-solving strategies did you use? 
  • Was there something that you wish you would have done differently? 


The Two Flavors of Confirmation Bias

Informally, the confirmation bias is the tendency to seek evidence that is consistent with your beliefs. The more personal the beliefs, the stronger the bias. More formally, there are two parts to the definition. The first part is "searching for confirmatory evidence," and the second part is "selectively interpreting the data to fit with one's hypothesis."


Selective Search of Data: The Luminiferous Ether

The number generation game that you played at the beginning of this post is a good example of looking for evidence that conforms to your initial hypothesis [1]. It's a tricky puzzle, and an overwhelming majority of people submit triples that confirm their suspicions. If this describes you, then you are not alone.
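To make this concrete: in Wason's original version of the task [1], the hidden rule is simply "any three numbers in increasing order." The sketch below (with a deliberately narrow guess, chosen just for illustration) shows why confirming tests feel so satisfying yet teach us so little. Every triple generated from the narrow hypothesis also satisfies the real rule, so only a triple the hypothesis forbids can reveal that the guess is too specific.

```python
def hidden_rule(triple):
    """The experimenter's actual rule: any strictly increasing sequence."""
    a, b, c = triple
    return a < b < c

def my_hypothesis(triple):
    """A typical, overly narrow guess: each number goes up by 2."""
    a, b, c = triple
    return b - a == 2 and c - b == 2

# Confirming tests: both the guess and the real rule say "yes," so I learn nothing new.
for triple in [(2, 4, 6), (10, 12, 14), (100, 102, 104)]:
    print(triple, "rule:", hidden_rule(triple), "my guess:", my_hypothesis(triple))

# A disconfirming test: forbidden by my guess but allowed by the rule -- the only kind
# of test that can expose my hypothesis as too narrow.
print((1, 2, 3), "rule:", hidden_rule((1, 2, 3)), "my guess:", my_hypothesis((1, 2, 3)))
```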

Confirmation bias is not relegated to the psychological laboratory. It also operates in the real world. Scientists, for example, often have a vested and personal interest in seeing their hypotheses confirmed by their data. A classic example in the history of science is the search for evidence of the "luminiferous ether." Through most of the 19th century, physicists believed that this was the substance that carried light. Like sound, light was assumed to need a medium through which to propagate. Finally, in 1887, Albert Michelson and Edward Morley conducted a famous experiment that failed to find any evidence of the ether [2]. Before that experiment, a great deal of effort had been invested in finding evidence for this mysterious substance.

Bottom line: The data are selectively collected and disconfirmatory evidence is deemed irrelevant.


Selective Interpretation of Data: The People v. O. J. Simpson 

The O. J. Simpson trial is a good example of selectively interpreting evidence to support your position or claim [3]. As in most trials, there was evidence that nobody could deny: blood found at O. J.'s house contained the DNA of Nicole Brown Simpson. Blood found in O. J.'s white Ford Bronco matched both Nicole's and Ron Goldman's DNA. O. J. Simpson had previously been arrested for physically assaulting Nicole. These are all incontrovertible facts. However, the defense and the prosecution interpreted the data differently. The defense said that the blood samples were planted by a racist LAPD cop. The prosecution claimed that the blood was not planted, but was the result of the murders and O. J.'s subsequent attempt at a cover-up.

Bottom line: The data are not in question, but the interpretation of the data is subject to dispute.

The S.T.E.M. Connection

There are implications of the confirmation bias for the classroom as well. In the mid- to late-1960s, educational psychologists experimentally manipulated teachers' expectations of their students. The researchers randomly selected certain students and told their teachers that those children were about to experience a learning "spurt." The remaining students served as the comparison group. 

What did they find? They found that teacher expectations had a measurable impact on the number of IQ points the students gained over the course of an academic year. The effect was particularly strong for kids in first and second grade [4]. Although the authors did not provide a mechanism, we might suspect that the confirmation bias was at work. Every time a child in the "spurt" condition did something notable, it confirmed the teacher's expectation. And if a student failed to live up to that expectation, you might imagine the teacher was able to explain away the student's behavior (e.g., she was just having a bad day).

Confirmation bias plagues us all, and it can be difficult to avoid. Given that, it is important to experience it firsthand, receive feedback when it does happen, and practice looking for and interpreting evidence that goes against one's beliefs. Only then can we get a true picture of the world.  


Share and Enjoy!

Dr. Bob

Going Beyond the Information Given

[1] Wason, P. C. (1960). On the failure to eliminate hypotheses in a conceptual task. Quarterly Journal of Experimental Psychology, 12(3), 129-140.

[2] Motta, L. (2007) Michelson-Morley experiment. Retrieved from http://scienceworld.wolfram.com/physics/Michelson-MorleyExperiment.html

[3] Nickerson, R. S. (1998). Confirmation bias: A ubiquitous phenomenon in many guises. Review of General Psychology, 2(2), 175.

[4] Rosenthal, R., & Jacobson, L. (1968). Pygmalion in the classroom. New York: Holt, Rinehart and Winston.

Thursday, May 28, 2015

To the Ends of the Hills: Problem-solving Heuristics

Suppose you are a farmer, and you are transporting your goods to market. You have a fox, a chicken, and some grain. You come across a river and a boat that holds you and one other item. If you leave the fox alone with the chicken, he will eat the chicken. If you leave the chicken alone with the grain, she will eat the grain. How can you safely ferry all of your goods across the river?




Algorithms v. Heuristics

In a previous post, we drew a distinction between routine and insight problem solving. Routine problems are nice because we see them all the time. Because of that, we have an available procedure that can be easily deployed. Insight problems, however, are more stubborn because we don't have a ready-made strategy. Instead, we need to invent or apply creative approaches to the problem.

In addition to distinguishing between types of problems, a distinction can also be made for the problem-solving process. On one hand, we have algorithms, which are problem-solving procedures that guarantee an answer. The start state is well defined, and you also have a set of operators that transform the problem from one state to another. You apply the operators in a prescribed order until the solution is generated. It might take some time, but you will eventually arrive at a solution. Long division is an example of an algorithm because it is a step-by-step procedure that will guarantee a solution. Unfortunately, that algorithm only works for dividing numbers. You can't use it to pick an outfit for work. 
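For the curious, here is a rough sketch of the long-division procedure for positive whole numbers. Because the steps are fixed and you work through every digit exactly once, it is guaranteed to terminate with the correct quotient and remainder:

```python
def long_division(dividend: int, divisor: int):
    """Digit-by-digit long division, just like the procedure taught in school."""
    quotient_digits = []
    remainder = 0
    for digit in str(dividend):                        # bring down one digit at a time
        remainder = remainder * 10 + int(digit)
        quotient_digits.append(remainder // divisor)   # how many times does it go in?
        remainder = remainder % divisor                # carry the rest to the next column
    quotient = int("".join(str(d) for d in quotient_digits))
    return quotient, remainder

print(long_division(7536, 4))  # (1884, 0)
```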

Contrast an algorithm with a heuristic, which trades a guaranteed solution for broader applicability. An algorithm only works if one exists, and it is problem specific. Heuristics, on the other hand, apply to a broad range of problems. The tradeoff is that a heuristic might not give you an answer (or it might provide a sub-optimal solution). Let's take a look at two problem-solving heuristics. 


"Trying to get up that great big hill (of hope)" -- 4 Non Blondes

The first heuristic is called hill climbing (or difference reduction) [1]. It's called "hill climbing" because you can envision the problem space as a mountain. At the base is the starting point (or initial state). The top of the mountain is the solution (or goal state). The hill climbing heuristic selects an operator that reduces the difference between the current state and the eventual goal state. 

To make this more concrete, let's apply the hill-climbing heuristic to the farmer's dilemma that opened this post. I don't have an algorithm for solving this problem because I've never seen it before [2]. To begin, note that there is only one problem-solving operator that can change the problem state: loading something onto the boat and moving it to the other side of the river. There are two constraints: I can't leave the fox and chicken alone together, and I can't leave the chicken and grain alone together. 

To apply the hill-climbing heuristic, I need to select an object. The chicken is the only option because the fox isn't interested in the grain. In terms of climbing the hill, this move gets me one step closer to the top. I go back across the river, and I now have to select something else. It doesn't seem to matter which object I choose, so I select at random. I pick the grain. I move it over to the other side of the river, and I am one step closer to the solution. Warning: here comes the hard part. I now have to take something back across the river; otherwise I would violate one of the problem-solving constraints (i.e., leaving the chicken with the delicious grain). This goes against climbing the hill because I have to take a step backwards, away from the goal. 

As you can see, hill climbing does not guarantee a solution. The reason this puzzle is potentially difficult is precisely because you have to make a move that takes you further away from the goal state. 
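If you'd like to see the full solution path, here is a small sketch that searches the puzzle's state space. It uses a breadth-first search rather than pure hill climbing, which is exactly why it is willing to take that backward step (the state representation below is just one I picked for illustration):

```python
from collections import deque

ITEMS = {"fox", "chicken", "grain"}

def unsafe(bank, farmer_here):
    """A bank is only unsafe when the farmer isn't there to supervise."""
    if farmer_here:
        return False
    return {"fox", "chicken"} <= bank or {"chicken", "grain"} <= bank

def solve():
    # A state is (items on the left bank, side the farmer is on); everything starts on the left.
    start, goal = (frozenset(ITEMS), "left"), (frozenset(), "right")
    queue, seen = deque([(start, [])]), {start}
    while queue:
        (left, side), path = queue.popleft()
        if (left, side) == goal:
            return path
        here = left if side == "left" else ITEMS - left
        for cargo in [None] + sorted(here):            # cross alone, or take one item
            new_left = set(left)
            if cargo is not None:
                (new_left.discard if side == "left" else new_left.add)(cargo)
            new_side = "right" if side == "left" else "left"
            right = ITEMS - new_left
            if unsafe(new_left, new_side == "left") or unsafe(right, new_side == "right"):
                continue                               # this crossing violates a constraint
            state = (frozenset(new_left), new_side)
            if state not in seen:
                seen.add(state)
                queue.append((state, path + [(cargo or "nothing", new_side)]))

for step, (cargo, bank) in enumerate(solve(), 1):
    print(f"{step}. Take {cargo} to the {bank} bank")
```

Notice that the printed plan includes the crucial "take the chicken back" crossing, the very move a strict hill climber would refuse to make.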


Means-ends Analysis: It's a Means...to an End!

The second problem-solving heuristic is called means-ends analysis, and it attempts to solve a larger problem (or goal) by breaking it down into smaller sub-problems (or subgoals). Suppose my goal is to drive to work. But when I try to start my car, it fails to turn over. Now I have a problem: How do I get to work? I could call a coworker for a ride, but I don't remember her number. Thus, I have to set another subgoal to find her number and give her a call.

Let's take another example. Means-ends analysis works really well for the 3-disk version of the Tower of Hanoi. Here is the initial state:



My top-level goal is to get all the disks onto the rightmost peg. Since that currently isn't possible, I set a subgoal to move the blue disk. But the purple disk is on top of it, so I set another subgoal to move the purple disk out of the way. But the purple disk is blocked by the red disk, so I set a subgoal to move the red disk to a different peg.


As you can see, I now have a bunch of subgoals hanging around. It's hard to keep track of them because my working memory is severely constrained. Thus, the more disks I have, the more subgoals I accumulate, which adds an additional burden to working memory.
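Here is a minimal sketch of that subgoal structure in code. To move a stack of n disks, you first set a subgoal to move the n - 1 disks sitting on top of it out of the way, then move the big disk, and then set another subgoal to put the n - 1 disks back on top. The recursion keeps track of all the "hanging" subgoals for us, which is exactly the bookkeeping that strains our working memory:

```python
def move_stack(n, source, target, spare):
    """Move a stack of n disks from source to target, means-ends style."""
    if n == 0:
        return
    move_stack(n - 1, source, spare, target)           # subgoal: clear the disks on top
    print(f"Move disk {n} from {source} to {target}")  # the move we actually wanted to make
    move_stack(n - 1, spare, target, source)           # subgoal: restack the smaller disks

# The 3-disk version described above: everything from the left peg to the rightmost peg.
move_stack(3, "left", "right", "middle")
```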



The STEM Connection

Both math and science education might benefit from knowing about the different problem-solving heuristics. Let's consider science first. One of the top-level goals of science is to build an explanation for some observable phenomenon. If we use the language developed here, the top-level goal is to construct a model or an explanation. The research question or the hypothesis is the problem to be solved. That problem can be decomposed into smaller problems or subgoals. Suppose I want to measure the distance a bee flies after leaving the hive. Thus, I set a subgoal to figure out how to track individual bees. That measurement problem opens several other interesting subgoals.

For math education, it might be useful for students to know the distinction between an algorithm and a heuristic. Some math problems are encountered so frequently that the field of mathematics has developed an algorithm that can be learned and executed whenever the conditions of the problem match the algorithm. But other math problems might not have a ready-made solution (e.g., How many pounds of trash are generated by New York City in a day?). When students encounter these types of questions, then it is time to find a problem-solving heuristic that drives toward a solution. Finding a heuristic, one might say, becomes the first subgoal in finding a solution! 


Share and Enjoy!

Dr. Bob

For More Information

[1] I consulted John Anderson's very approachable textbook Cognitive Psychology and Its Implications for the description of the difference-reduction and means-ends analysis heuristics. I highly recommend picking up a copy of this book.

[2] Actually, that's not 100% accurate. The fox, chicken, and grain problem is eerily reminiscent of the Hobbits and Orcs problem that we encountered in a previous post.