
Slide 1 © 2005 Thomson/South-Western

Slides Prepared by JOHN S. LOUCKS

St. Edward’s University

Slide 2 © 2005 Thomson/South-Western

Chapter 4: Introduction to Probability

• Experiments, Counting Rules, and Assigning Probabilities
• Events and Their Probabilities
• Some Basic Relationships of Probability
• Conditional Probability
• Bayes’ Theorem

Slide 3 © 2005 Thomson/South-Western

Probability as a Numerical Measure of the Likelihood of Occurrence

Probability is measured on a scale from 0 to 1, with increasing likelihood of occurrence:
• Near 0: the event is very unlikely to occur.
• At .5: the occurrence of the event is just as likely as it is unlikely.
• Near 1: the event is almost certain to occur.


Slide 4 © 2005 Thomson/South-Western

An Experiment and Its Sample Space

An experiment is any process that generates well-defined outcomes.

The sample space for an experiment is the set of all experimental outcomes.

An experimental outcome is also called a sample point.

Slide 5 © 2005 Thomson/South-Western

Example: Bradley Investments

Bradley has invested in two stocks, Markley Oil and Collins Mining. Bradley has determined that the possible outcomes of these investments three months from now are as follows.

Investment Gain or Loss in 3 Months (in $000)
Markley Oil: 10, 5, 0, −20
Collins Mining: 8, −2

Slide 6 © 2005 Thomson/South-Western

A Counting Rule for Multiple-Step Experiments

If an experiment consists of a sequence of k steps in which there are n1 possible results for the first step, n2 possible results for the second step, and so on, then the total number of experimental outcomes is given by (n1)(n2) . . . (nk).

A helpful graphical representation of a multiple-step experiment is a tree diagram.


Slide 7 © 2005 Thomson/South-Western

A Counting Rule for Multiple-Step Experiments

Bradley Investments can be viewed as a two-step experiment. It involves two stocks, each with a set of experimental outcomes.

Markley Oil: n1 = 4
Collins Mining: n2 = 2
Total Number of Experimental Outcomes: n1n2 = (4)(2) = 8
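A minimal Python sketch of this counting rule, using the Bradley Investments data above (variable names are illustrative choices, not part of the slides):

```python
# Counting rule for a two-step experiment, with the Bradley outcomes in $000.
from itertools import product

markley = [10, 5, 0, -20]   # Markley Oil outcomes (n1 = 4)
collins = [8, -2]           # Collins Mining outcomes (n2 = 2)

# Counting rule: total outcomes = (n1)(n2)
print(len(markley) * len(collins))   # 8

# Enumerating the sample points reproduces the tree diagram on the next slide.
for m, c in product(markley, collins):
    print((m, c), "net gain/loss:", m + c, "($000)")
```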

Slide 8 © 2005 Thomson/South-Western

Tree Diagram

Markley Oil (Stage 1): Gain 10, Gain 5, Even, Lose 20
Collins Mining (Stage 2): Gain 8, Lose 2

Experimental Outcomes:
(10, 8)   Gain $18,000
(10, −2)  Gain $8,000
(5, 8)    Gain $13,000
(5, −2)   Gain $3,000
(0, 8)    Gain $8,000
(0, −2)   Lose $2,000
(−20, 8)  Lose $12,000
(−20, −2) Lose $22,000

Slide 9 © 2005 Thomson/South-Western

Counting Rule for Combinations

A second useful counting rule enables us to count the number of experimental outcomes when n objects are to be selected from a set of N objects.

Number of Combinations of N Objects Taken n at a Time:

C(N, n) = N! / [n!(N − n)!]

where: N! = N(N − 1)(N − 2) . . . (2)(1)
       n! = n(n − 1)(n − 2) . . . (2)(1)
       0! = 1
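A minimal Python sketch of the combinations rule; the example values (N = 6, n = 3) are illustrative and not taken from the slides:

```python
# Combinations: C(N, n) = N! / [n!(N - n)!]
import math

N, n = 6, 3
by_formula = math.factorial(N) // (math.factorial(n) * math.factorial(N - n))
print(by_formula)        # 20
print(math.comb(N, n))   # 20 (built-in equivalent, Python 3.8+)
```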


Slide 10 © 2005 Thomson/South-Western

Counting Rule for Permutations

A third useful counting rule enables us to count the number of experimental outcomes when n objects are to be selected from a set of N objects, where the order of selection is important.

Number of Permutations of N Objects Taken n at a Time:

P(N, n) = n! C(N, n) = N! / (N − n)!

where: N! = N(N − 1)(N − 2) . . . (2)(1)
       n! = n(n − 1)(n − 2) . . . (2)(1)
       0! = 1
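A minimal Python sketch of the permutations rule; the example values (N = 6, n = 3) are illustrative and not taken from the slides:

```python
# Permutations: P(N, n) = N! / (N - n)!
import math

N, n = 6, 3
by_formula = math.factorial(N) // math.factorial(N - n)
print(by_formula)        # 120
print(math.perm(N, n))   # 120 (built-in equivalent, Python 3.8+)
```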

Slide 11 © 2005 Thomson/South-Western

Assigning Probabilities

• Classical Method: assigning probabilities based on the assumption of equally likely outcomes
• Relative Frequency Method: assigning probabilities based on experimentation or historical data
• Subjective Method: assigning probabilities based on judgment

Slide 12 © 2005 Thomson/South-Western

Classical Method

If an experiment has n possible outcomes, this method would assign a probability of 1/n to each outcome.

Example
Experiment: Rolling a die
Sample Space: S = {1, 2, 3, 4, 5, 6}
Probabilities: Each sample point has a 1/6 chance of occurring.
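A minimal Python sketch of the classical method for the die example (variable names are illustrative):

```python
# Classical method: n equally likely outcomes each receive probability 1/n.
from fractions import Fraction

sample_space = [1, 2, 3, 4, 5, 6]
p = {outcome: Fraction(1, len(sample_space)) for outcome in sample_space}
print(p[3], sum(p.values()))   # 1/6 1
```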


Slide 13 © 2005 Thomson/South-Western

Relative Frequency Method

Example: Lucas Tool Rental

Lucas Tool Rental would like to assign probabilities to the number of car polishers it rents each day. Office records show the following frequencies of daily rentals for the last 40 days.

Number of Polishers Rented   Number of Days
0                            4
1                            6
2                            18
3                            10
4                            2

Slide 14 © 2005 Thomson/South-Western

Relative Frequency Method

Each probability assignment is given by dividing the frequency (number of days) by the total frequency (total number of days).

Number of Polishers Rented   Number of Days   Probability
0                            4                .10  (= 4/40)
1                            6                .15
2                            18               .45
3                            10               .25
4                            2                .05
Total                        40               1.00
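A minimal Python sketch of the relative frequency method, reproducing the Lucas Tool Rental assignments above:

```python
# Relative frequency method: probability = frequency / total frequency.
rentals = {0: 4, 1: 6, 2: 18, 3: 10, 4: 2}   # polishers rented -> number of days
total_days = sum(rentals.values())           # 40

probabilities = {k: days / total_days for k, days in rentals.items()}
print(probabilities)                 # {0: 0.1, 1: 0.15, 2: 0.45, 3: 0.25, 4: 0.05}
print(sum(probabilities.values()))   # 1.0
```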

Slide 15 © 2005 Thomson/South-Western

Subjective Method

When economic conditions and a company’s circumstances change rapidly, it might be inappropriate to assign probabilities based solely on historical data.

We can use any data available as well as our experience and intuition, but ultimately a probability value should express our degree of belief that the experimental outcome will occur.

The best probability estimates often are obtained by combining the estimates from the classical or relative frequency approach with the subjective estimate.


Slide 16 © 2005 Thomson/South-Western

Subjective Method

Applying the subjective method, an analyst made the following probability assignments.

Exper. Outcome   Net Gain or Loss   Probability
(10, 8)          $18,000 Gain       .20
(10, −2)         $8,000 Gain        .08
(5, 8)           $13,000 Gain       .16
(5, −2)          $3,000 Gain        .26
(0, 8)           $8,000 Gain        .10
(0, −2)          $2,000 Loss        .12
(−20, 8)         $12,000 Loss       .02
(−20, −2)        $22,000 Loss       .06

Slide 17 © 2005 Thomson/South-Western

Events and Their Probabilities

An event is a collection of sample points.

The probability of any event is equal to the sum of the probabilities of the sample points in the event.

If we can identify all the sample points of an experiment and assign a probability to each, we can compute the probability of an event.

Slide 18 © 2005 Thomson/South-Western

Events and Their Probabilities

Event M = Markley Oil Profitable
M = {(10, 8), (10, −2), (5, 8), (5, −2)}

P(M) = P(10, 8) + P(10, −2) + P(5, 8) + P(5, −2)
     = .20 + .08 + .16 + .26
     = .70
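A minimal Python sketch of this sum, using the subjective assignments from Slide 16 (variable names are illustrative):

```python
# An event's probability is the sum of its sample-point probabilities.
p = {(10, 8): .20, (10, -2): .08, (5, 8): .16, (5, -2): .26,
     (0, 8): .10, (0, -2): .12, (-20, 8): .02, (-20, -2): .06}

M = {(10, 8), (10, -2), (5, 8), (5, -2)}   # Markley Oil profitable
P_M = sum(p[s] for s in M)
print(round(P_M, 2))   # 0.7
```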


Slide 19 © 2005 Thomson/South-Western

Events and Their Probabilities

Event C = Collins Mining Profitable
C = {(10, 8), (5, 8), (0, 8), (−20, 8)}

P(C) = P(10, 8) + P(5, 8) + P(0, 8) + P(−20, 8)
     = .20 + .16 + .10 + .02
     = .48

Slide 20 © 2005 Thomson/South-Western

Some Basic Relationships of Probability

There are some basic probability relationships that can be used to compute the probability of an event without knowledge of all the sample point probabilities.

Complement of an Event

Intersection of Two Events

Mutually Exclusive Events

Union of Two Events

Slide 21 © 2005 Thomson/South-Western

Complement of an Event

The complement of event A is defined to be the event consisting of all sample points that are not in A.

The complement of A is denoted by Ac.

Venn diagram: sample space S containing event A and its complement Ac.


Slide 22 © 2005 Thomson/South-Western

Union of Two Events

The union of events A and B is the event containing all sample points that are in A or B or both.

The union of events A and B is denoted by A ∪ B.

Venn diagram: sample space S with overlapping events A and B; the union covers both events.

Slide 23 © 2005 Thomson/South-Western

Union of Two Events

Event M = Markley Oil Profitable
Event C = Collins Mining Profitable

M ∪ C = Markley Oil Profitable or Collins Mining Profitable
M ∪ C = {(10, 8), (10, −2), (5, 8), (5, −2), (0, 8), (−20, 8)}

P(M ∪ C) = P(10, 8) + P(10, −2) + P(5, 8) + P(5, −2) + P(0, 8) + P(−20, 8)
         = .20 + .08 + .16 + .26 + .10 + .02
         = .82

Slide 24 © 2005 Thomson/South-Western

Intersection of Two Events

The intersection of events A and B is the set of all sample points that are in both A and B.

The intersection of events A and B is denoted by A ∩ B.

Venn diagram: sample space S with overlapping events A and B; the intersection of A and B is the overlapping region.


Slide 25 © 2005 Thomson/South-Western

Intersection of Two Events

Event M = Markley Oil Profitable
Event C = Collins Mining Profitable

M ∩ C = Markley Oil Profitable and Collins Mining Profitable
M ∩ C = {(10, 8), (5, 8)}

P(M ∩ C) = P(10, 8) + P(5, 8)
         = .20 + .16
         = .36

Slide 26 © 2005 Thomson/South-Western

Addition Law

The addition law provides a way to compute the probability of event A, or B, or both A and B occurring.

The law is written as:

P(A ∪ B) = P(A) + P(B) − P(A ∩ B)

Slide 27 © 2005 Thomson/South-Western

Addition Law

Event M = Markley Oil Profitable
Event C = Collins Mining Profitable

M ∪ C = Markley Oil Profitable or Collins Mining Profitable

We know: P(M) = .70, P(C) = .48, P(M ∩ C) = .36
Thus: P(M ∪ C) = P(M) + P(C) − P(M ∩ C)
              = .70 + .48 − .36
              = .82

(This result is the same as that obtained earlier using the definition of the probability of an event.)
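A minimal Python sketch of the addition law with the values from this slide:

```python
# Addition law: P(M ∪ C) = P(M) + P(C) - P(M ∩ C)
P_M, P_C, P_M_and_C = 0.70, 0.48, 0.36
P_M_or_C = P_M + P_C - P_M_and_C
print(round(P_M_or_C, 2))   # 0.82
```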


Slide 28 © 2005 Thomson/South-Western

Mutually Exclusive Events

Two events are said to be mutually exclusive if the events have no sample points in common.

Two events are mutually exclusive if, when one event occurs, the other cannot occur.

Venn diagram: sample space S with non-overlapping events A and B.

Slide 29 © 2005 Thomson/South-Western

Mutually Exclusive Events

If events A and B are mutually exclusive, P(A ∩ B) = 0.

The addition law for mutually exclusive events is:

P(A ∪ B) = P(A) + P(B)

(there’s no need to include “− P(A ∩ B)”)

Slide 30 © 2005 Thomson/South-Western

Conditional Probability

The probability of an event given that another event has occurred is called a conditional probability.

The conditional probability of A given B is denoted by P(A|B).

A conditional probability is computed as follows:

P(A|B) = P(A ∩ B) / P(B)


Slide 31 © 2005 Thomson/South-Western

Conditional Probability

Event M = Markley Oil Profitable
Event C = Collins Mining Profitable

P(C|M) = Collins Mining Profitable given Markley Oil Profitable

We know: P(M ∩ C) = .36, P(M) = .70

Thus: P(C|M) = P(C ∩ M) / P(M) = .36 / .70 = .5143
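A minimal Python sketch of this conditional probability, using the slide values:

```python
# Conditional probability: P(C|M) = P(C ∩ M) / P(M)
P_M, P_M_and_C = 0.70, 0.36
P_C_given_M = P_M_and_C / P_M
print(round(P_C_given_M, 4))   # 0.5143
```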

Slide 32 © 2005 Thomson/South-Western

Multiplication Law

The multiplication law provides a way to compute the probability of the intersection of two events.

The law is written as:

P(A ∩ B) = P(B)P(A|B)

Slide 33 © 2005 Thomson/South-Western

Multiplication Law

Event M = Markley Oil Profitable
Event C = Collins Mining Profitable

We know: P(M) = .70, P(C|M) = .5143

M ∩ C = Markley Oil Profitable and Collins Mining Profitable

Thus: P(M ∩ C) = P(M)P(C|M) = (.70)(.5143) = .36

(This result is the same as that obtained earlier using the definition of the probability of an event.)


Slide 34 © 2005 Thomson/South-Western

Independent Events

If the probability of event A is not changed by the existence of event B, we would say that events A and B are independent.

Two events A and B are independent if:

P(A|B) = P(A)  or  P(B|A) = P(B)

Slide 35 © 2005 Thomson/South-Western

Multiplication Law for Independent Events

The multiplication law also can be used as a test to see if two events are independent.

The law is written as:

P(A ∩ B) = P(A)P(B)

Slide 36 © 2005 Thomson/South-Western

Multiplication Law for Independent Events

Event M = Markley Oil Profitable
Event C = Collins Mining Profitable

Are events M and C independent?
Does P(M ∩ C) = P(M)P(C)?

We know: P(M ∩ C) = .36, P(M) = .70, P(C) = .48
But: P(M)P(C) = (.70)(.48) = .34, not .36

Hence: M and C are not independent.
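A minimal Python sketch of this independence check, using the slide values:

```python
# Independence test: does P(M ∩ C) equal P(M)P(C)?
P_M, P_C, P_M_and_C = 0.70, 0.48, 0.36
product = P_M * P_C
print(round(product, 2))                 # 0.34, not .36
print(abs(P_M_and_C - product) < 1e-9)   # False -> M and C are not independent
```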


Slide 37 © 2005 Thomson/South-Western

Bayes’ Theorem

Prior Probabilities → New Information → Application of Bayes’ Theorem → Posterior Probabilities

Often we begin probability analysis with initial or prior probabilities.
Then, from a sample, special report, or a product test, we obtain some additional information.
Given this information, we calculate revised or posterior probabilities.

Bayes’ theorem provides the means for revising the prior probabilities.

Slide 38 © 2005 Thomson/South-Western

Bayes’ Theorem

Example: L. S. Clothiers

A proposed shopping center will provide strong competition for downtown businesses like L. S. Clothiers. If the shopping center is built, the owner of L. S. Clothiers feels it would be best to relocate to the center.

The shopping center cannot be built unless a zoning change is approved by the town council. The planning board must first make a recommendation, for or against the zoning change, to the council.

Slide 39 © 2005 Thomson/South-Western

Bayes’ Theorem

• Prior Probabilities

Let:
A1 = town council approves the zoning change
A2 = town council disapproves the change

Using subjective judgment:
P(A1) = .7, P(A2) = .3


Slide 40 © 2005 Thomson/South-Western

Bayes’ Theorem

• New Information

The planning board has recommended against the zoning change. Let B denote the event of a negative recommendation by the planning board.

Given that B has occurred, should L. S. Clothiers revise the probabilities that the town council will approve or disapprove the zoning change?

Slide 41 © 2005 Thomson/South-Western

Bayes’ Theorem

• Conditional Probabilities

Past history with the planning board and the town council indicates the following:

P(B|A1) = .2    P(B|A2) = .9

Hence:
P(Bc|A1) = .8    P(Bc|A2) = .1

Slide 42 © 2005 Thomson/South-Western

Bayes’ Theorem

Tree Diagram (Stage 1: Town Council; Stage 2: Planning Board)

Experimental outcomes and joint probabilities:
P(A1) = .7, P(B|A1) = .2:   P(A1 ∩ B) = .14
P(A1) = .7, P(Bc|A1) = .8:  P(A1 ∩ Bc) = .56
P(A2) = .3, P(B|A2) = .9:   P(A2 ∩ B) = .27
P(A2) = .3, P(Bc|A2) = .1:  P(A2 ∩ Bc) = .03


Slide 43 © 2005 Thomson/South-Western

Bayes’ Theorem

To find the posterior probability that event Ai will occur given that event B has occurred, we apply Bayes’ theorem:

P(Ai|B) = P(Ai)P(B|Ai) / [P(A1)P(B|A1) + P(A2)P(B|A2) + ... + P(An)P(B|An)]

Bayes’ theorem is applicable when the events for which we want to compute posterior probabilities are mutually exclusive and their union is the entire sample space.

Slide 44 © 2005 Thomson/South-Western

Bayes’ Theorem

• Posterior Probabilities

Given the planning board’s recommendation not to approve the zoning change, we revise the prior probabilities as follows:

P(A1|B) = P(A1)P(B|A1) / [P(A1)P(B|A1) + P(A2)P(B|A2)]
        = (.7)(.2) / [(.7)(.2) + (.3)(.9)]
        = .14 / .41
        = .34
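A minimal Python sketch of this posterior computation, using the slide values:

```python
# Bayes' theorem for the L. S. Clothiers example.
p_A1, p_A2 = 0.7, 0.3                     # priors
p_B_given_A1, p_B_given_A2 = 0.2, 0.9     # likelihoods of a negative recommendation

numerator = p_A1 * p_B_given_A1
denominator = p_A1 * p_B_given_A1 + p_A2 * p_B_given_A2   # P(B) = 0.41
print(round(numerator / denominator, 4))   # 0.3415 (the .34 shown above)
```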

Slide 45 © 2005 Thomson/South-Western

Bayes’ Theorem

• Conclusion

The planning board’s recommendation is good news for L. S. Clothiers. The posterior probability of the town council approving the zoning change is .34, compared to a prior probability of .70.


Slide 46 © 2005 Thomson/South-Western

Tabular Approach

• Step 1
Prepare the following three columns:
Column 1 − The mutually exclusive events for which posterior probabilities are desired.
Column 2 − The prior probabilities for the events.
Column 3 − The conditional probabilities of the new information given each event.

Slide 47 © 2005 Thomson/South-Western

Tabular Approach

(1) Events Ai   (2) Prior Probabilities P(Ai)   (3) Conditional Probabilities P(B|Ai)
A1              .7                              .2
A2              .3                              .9
Total           1.0

Slide 48 © 2005 Thomson/South-Western

Tabular Approach

• Step 2
Column 4: Compute the joint probabilities for each event and the new information B by using the multiplication law. Multiply the prior probabilities in column 2 by the corresponding conditional probabilities in column 3. That is, P(Ai ∩ B) = P(Ai)P(B|Ai).


Slide 49 © 2005 Thomson/South-Western

Tabular Approach

(1) Events Ai   (2) Prior P(Ai)   (3) Conditional P(B|Ai)   (4) Joint P(Ai ∩ B)
A1              .7                .2                         .14  (= .7 x .2)
A2              .3                .9                         .27
Total           1.0

Slide 50 © 2005 Thomson/South-Western

Tabular Approach

• Step 2 (continued)
We see that there is a .14 probability of the town council approving the zoning change and a negative recommendation by the planning board.

There is a .27 probability of the town council disapproving the zoning change and a negative recommendation by the planning board.

Slide 51 © 2005 Thomson/South-Western

Tabular Approach

• Step 3
Column 4: Sum the joint probabilities. The sum is the probability of the new information, P(B). The sum .14 + .27 shows an overall probability of .41 of a negative recommendation by the planning board.


Slide 52 © 2005 Thomson/South-Western

Tabular Approach

(1) Events Ai   (2) Prior P(Ai)   (3) Conditional P(B|Ai)   (4) Joint P(Ai ∩ B)
A1              .7                .2                         .14
A2              .3                .9                         .27
Total           1.0                                          P(B) = .41

Slide 53 © 2005 Thomson/South-Western

Tabular Approach

• Step 4
Column 5: Compute the posterior probabilities using the basic relationship of conditional probability:

P(Ai|B) = P(Ai ∩ B) / P(B)

The joint probabilities P(Ai ∩ B) are in column 4 and the probability P(B) is the sum of column 4.

Slide 54 © 2005 Thomson/South-Western

Tabular Approach

(1) Events Ai   (2) Prior P(Ai)   (3) Conditional P(B|Ai)   (4) Joint P(Ai ∩ B)   (5) Posterior P(Ai|B)
A1              .7                .2                         .14                   .3415  (= .14/.41)
A2              .3                .9                         .27                   .6585
Total           1.0                                          P(B) = .41            1.0000
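A minimal Python sketch that reproduces the tabular approach (columns 2 through 5) for this example; variable names are illustrative:

```python
# Tabular approach, steps 1-4, for the L. S. Clothiers example.
events = ["A1", "A2"]
prior = {"A1": 0.7, "A2": 0.3}            # column 2
conditional = {"A1": 0.2, "A2": 0.9}      # column 3: P(B | Ai)

joint = {a: prior[a] * conditional[a] for a in events}   # column 4: P(Ai ∩ B)
p_B = sum(joint.values())                                 # sum of column 4
posterior = {a: joint[a] / p_B for a in events}           # column 5: P(Ai | B)

for a in events:
    print(a, prior[a], conditional[a], round(joint[a], 2), round(posterior[a], 4))
print("P(B) =", round(p_B, 2))
# A1 0.7 0.2 0.14 0.3415
# A2 0.3 0.9 0.27 0.6585
# P(B) = 0.41
```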


Slide 55 © 2005 Thomson/South-Western

End of Chapter 4