Thinking Fast & Slow
Daniel Kahneman & Amos Tversky



Kahneman was awarded the 2002 Nobel Memorial Prize in Economic Sciences for this work.

Developed the idea that we have two systems of thinking:

• one fast, intuitive, and impressionistic (System 1), and

• one slow, deliberate, thoughtful, and systematic (System 2).

Note: this is a metaphor. There are not two separate parts of the brain, but rather two distinct ways of processing things.


System 1

Our fast system works almost automatically.

When we speak our native language, or determine where a sound is coming from, or do simple equations, or drive (if we are experienced), we are activating our System 1.

Any skill that we have mastered to the point where we do not have to think about it can become a System 1 response.

Many very complex actions that we perform, like walking or running, are skills we have practiced for decades.

System 1 is incredibly efficient but is also prone to mistakes.

System 1 is also the root of lazy thinking, such as racial stereotyping.


System 2

Our slow system is what we normally mean by “thinking”: judging and calculating.

It is at work when we are struggling to learn to speak a foreign language, drive a car for the first time, or write a good essay (or a blog post).

System 2 is hard work.

In fact, we are only capable of doing it for limited periods of time before we succumb to decision fatigue and revert to System 1 decision making.

Thinking is hard.

“The attentive System 2 is who we think we are. System 2 articulates judgments and makes choices, but often endorses or rationalizes ideas and feelings that were generated by System 1”

Daniel Kahneman


Illustration

• Imagine you are walking alone along an alley towards your car in bright daylight.

• As you pass a side passageway, you are startled to see someone there walking casually in your direction.

• Besides the initial moment of surprise, your System 1 notices that the person is a well-groomed man wearing nice clothes, and you make an instant snap judgment (this would be heavily conditioned by culture, of course) that you are probably safe from danger, so your physiological responses to the surprise taper down.

• That was System 1 that both registered the surprise and made the snap judgment.

• Then System 2 kicks in, and you realize that it is not impossible for a well-dressed person to be a mugger, and that it’s better not to take chances when you are alone, so you decide to quicken your pace to separate yourself from the stranger.


The difference…

System 1 was immediate and effortless, providing initial conclusions. It triggers associations, anchors on arbitrary information, and takes things encountered at face value.

“System 1 is a bit of a simpleton.”

System 2 brings more rational faculties to the situation. System 2’s effectiveness can be impaired in various ways: by distraction with mental calculations, by fatigue, by alcohol, and by recently having exercised willpower.

“People who are cognitively busy are also more likely to make selfish choices, use sexist language, and make superficial judgments in social situations.”


System 1 Heuristics

Heuristics are mental rules of thumb that we subconsciously employ when making decisions.

• Commitment Heuristic: “I’m already this far; I might as well keep on going.”

• Affect Heuristic: We “trust our gut”. If it feels good, it must be right.

• Anchoring Heuristic: We trust information that we just learned, even if it is wrong or irrelevant. (Related: Availability Heuristic. Information that we remember easily seems more important than information we do not.)

• Representativeness Heuristic: We assume that events that seem similar to us have a similar likelihood of occurrence. (e.g., Lena believes in women’s rights and Lena works at a bank. What is more likely: (a) Lena is a feminist bank teller, or (b) Lena is a bank teller? Most say (a), even though (a) can never be more likely than (b), since feminist bank tellers are a subset of bank tellers.)
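The Lena example is the classic conjunction fallacy (Kahneman and Tversky’s “Linda problem”): a conjunction of two conditions can never be more probable than either condition alone. A minimal sketch with invented counts (the numbers are purely illustrative, not from the slides):

```python
# Conjunction fallacy: for any events A and B, P(A and B) <= P(B).
# Hypothetical population; the counts below are invented for illustration.
total_people = 1000
bank_tellers = 100             # people who are bank tellers
feminist_bank_tellers = 30     # bank tellers who are also feminists (a subset)

p_teller = bank_tellers / total_people                    # P(bank teller)
p_feminist_teller = feminist_bank_tellers / total_people  # P(feminist AND bank teller)

# The conjunction is necessarily no more probable than the single event.
assert p_feminist_teller <= p_teller
print(f"P(bank teller) = {p_teller:.2f}, P(feminist bank teller) = {p_feminist_teller:.2f}")
```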


System 1 Cognitive Biases

• Belief Bias: Judging the truth of a conclusion by how believable we personally find it.

• Confirmation Bias: Interpreting new information so that it confirms our initial hypothesis or beliefs.

• Optimism Bias: Overestimating the likelihood of favorable outcomes.

• Hindsight Bias: Believing that past events were as predictable at the time they happened as they appear now.

• Framing Effect: How information is presented affects our opinions about it.

• Loss Aversion: Eliminating a risk of loss feels more valuable than an equivalent increase in the odds of winning.

• Narrative Fallacy: Believing that good stories are true stories.

• Regression Fallacy: Not taking into account the chance component of events (ignoring regression to the mean).

• Planning Fallacy: Overestimating benefits and underestimating costs and how long things will take.

• Halo Effect: Existing judgments about one aspect of a person are extended to all aspects of them.

• Small Numbers: Assuming a small sample is representative of a much larger population.

• WYSIATI: “What you see is all there is” (ignoring unknown information).
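To make the “Small Numbers” bias concrete, here is a small simulation of my own (not from the slides): with a fair coin, small samples produce lopsided proportions of heads far more often than large samples, which is why a small sample is a poor stand-in for the population.

```python
import random

# Law of small numbers: small samples look "extreme" far more often than large ones.
# The sample sizes and the 70% threshold are arbitrary choices for illustration.
def extreme_rate(sample_size, trials=10_000, threshold=0.7):
    """Fraction of samples whose share of heads is >= threshold or <= 1 - threshold."""
    extreme = 0
    for _ in range(trials):
        heads = sum(random.random() < 0.5 for _ in range(sample_size))
        share = heads / sample_size
        if share >= threshold or share <= 1 - threshold:
            extreme += 1
    return extreme / trials

random.seed(0)
for n in (10, 100, 1000):
    print(f"n = {n:4d}: lopsided samples ~ {extreme_rate(n):.1%}")
# Roughly a third of 10-flip samples are lopsided; almost none of the 1000-flip samples are.
```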


Prospect Theory

• The theory describes decision-making in two stages: editing and evaluation.

• During editing, the outcomes of a decision are ordered according to a heuristic.

– In particular, people decide which outcomes they consider equivalent, set a reference point, and then treat lesser outcomes as losses and greater ones as gains.

– The editing phase aims to alleviate cognitive biases. It also aims to resolve isolation effects, which stem from people’s propensity to isolate consecutive probabilities instead of treating them together.

• In the subsequent evaluation phase, people behave as if they compute a value (utility) for each alternative, based on its potential outcomes and their respective probabilities, and then choose the alternative with the higher value (a minimal sketch of this follows below).
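A minimal sketch of the evaluation phase in code. The functional forms and parameters (α = β = 0.88, λ = 2.25, γ = 0.61) are the estimates commonly attributed to Tversky & Kahneman (1992), and the example gamble is invented; none of this comes from the slides themselves.

```python
# Prospect-theory evaluation (sketch): V = sum of w(p) * v(x) over outcomes,
# where x is a gain or loss relative to a reference point.
ALPHA, BETA, LAMBDA, GAMMA = 0.88, 0.88, 2.25, 0.61  # assumed parameters

def value(x):
    """Value function: concave for gains, convex and steeper for losses."""
    return x ** ALPHA if x >= 0 else -LAMBDA * ((-x) ** BETA)

def weight(p, gamma=GAMMA):
    """Decision weight: overweights small probabilities, underweights large ones."""
    return p ** gamma / ((p ** gamma + (1 - p) ** gamma) ** (1 / gamma))

def prospect_value(prospect, reference=0.0):
    """Evaluate a prospect given as [(outcome, probability), ...]."""
    return sum(weight(p) * value(x - reference) for x, p in prospect)

# Example: a sure $450 versus a 50/50 gamble for $1000 or nothing.
sure_thing = [(450, 1.0)]
gamble = [(1000, 0.5), (0, 0.5)]
print(prospect_value(sure_thing), prospect_value(gamble))  # the sure thing scores higher
```

With these assumed parameters the sure $450 is preferred to the gamble, illustrating risk aversion over gains.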


Framing

• Participants were asked to choose between two treatments for 600 people affected by a deadly disease.

• Treatment A was predicted to result in 400 deaths, whereas Treatment B had a 1/3 chance that no one would die and a 2/3 chance that everyone would die.

• This choice was then presented to participants either with positive framing (i.e., how many people would live) or with negative framing (i.e., how many people would die).


• Treatment A was chosen by 72% of participants when it was presented with positive framing ("saves 200 lives"), dropping to only 22% when the same choice was presented with negative framing ("400 people will die").

• This effect has been shown in other contexts.

Framing  | Treatment A           | Treatment B
Positive | "Saves 200 lives"     | "A 1/3 chance of saving all 600 people, and a 2/3 chance of saving no one."
Negative | "400 people will die" | "A 1/3 chance that no one will die, and a 2/3 chance that all 600 will die."


Prospect Theory

Three claims about the value function:

1. The curve flattens out as it extends to the right or the left: incremental value diminishes.

2. The curve is steeper below zero than above it: losses loom larger than gains.

3. The reference point is not $0 but a reference point set by how the situation is framed.
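These claims describe the shape of the prospect-theory value function. One common way to write it, using the Tversky & Kahneman (1992) parameterization as an illustrative assumption (it is not given in the slides), is:

```latex
v(x) =
\begin{cases}
(x - r)^{\alpha} & x \ge r \quad \text{(concave: incremental value diminishes)} \\
-\lambda \,(r - x)^{\beta} & x < r \quad \text{(steeper, since } \lambda > 1\text{: losses loom larger)}
\end{cases}
\qquad 0 < \alpha, \beta < 1,
```

where r is the framed reference point rather than $0.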