Decision Analysis



What is Decision Analysis?

• The process of arriving at an optimal strategy given:
– Multiple decision alternatives
– Uncertain future events (chance events)
– The consequences associated with each decision alternative and each chance event.

The first step in decision analysis is to identify these three elements from a verbal statement of the problem. This process is called problem formulation.

Influence Diagram

• We said that decision alternatives, chance events and consequences constitute the problem formulation.

• The influence diagram is a graphical device that depicts the relationship among these three “nodes”.

• In an influence diagram, decision nodes are rectangles, chance nodes are circles and consequence nodes are diamonds.

• Arcs connecting one node to another show the direction of influence, i.e. what leads to what.

Illustration of influence diagram

• Draw on the Board

States of Nature

• Whereas the decision maker has control over the choice of the decision alternative, the outcomes of chance events are beyond his control and are decided by nature.

• The possible outcomes are referred to as “States of Nature”. One and only one of these states will occur.

Payoff

• The consequence resulting from a decision alternative and a state of nature is called a payoff.

• A table that shows the payoffs for all possible combinations of decision alternatives (as rows) and states of nature (as columns) is called a payoff table.

Illustration of a payoff table

• Draw on the board
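As a stand-in for the board illustration, here is a hypothetical payoff table sketched in Python. The plant-size alternatives, demand states and profit figures are invented purely for illustration:

```python
# Hypothetical payoff table (profits in thousands; all numbers illustrative).
# Rows = decision alternatives, columns = states of nature.
payoff_table = {
    "Small plant": {"Weak demand": 150, "Strong demand": 200},
    "Large plant": {"Weak demand": -50, "Strong demand": 500},
}

# Print it with alternatives as rows and states as columns.
states = ["Weak demand", "Strong demand"]
print(f"{'':<12}" + "".join(f"{s:>14}" for s in states))
for alt, row in payoff_table.items():
    print(f"{alt:<12}" + "".join(f"{row[s]:>14}" for s in states))
```

The same table is reused in the later sketches for the decision criteria.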

Decision Trees

• A graphical representation of the decision-making process and its sequence.

• The decision tree will have numbered nodes which may be decision points or chance events.

• The branches that leave a decision node are the decision alternatives.

• The branches that leave a chance node are the states of nature.

• Payoffs are shown at the end of states of nature branches.

Steps to draw a decision tree

• Identify all the decisions (and their alternatives) to be made and the order in which they are to be made.

• Identify the chance events that occur after each decision.

• Draw a tree diagram showing the sequence of decisions and the chance events. Use squares for decision nodes and circles for chance nodes

Illustration of a decision tree

• Draw on the board

Decision making

• Decision making without probabilities
– Probabilities may not be available
– A simple best-case / worst-case analysis is desirable.

• Decision making with probabilities
– Probabilities of the states of nature can be determined
– The stakes are high enough to justify a detailed analysis.

Decision making without probabilities

• Optimistic Approach (Best of Best)

• Conservative Approach (Best of Worst)

• Minimax Regret Approach (Minimum of maximum regret)

Best: the maximum payoff in a maximisation problem and the minimum payoff in a minimisation problem.

Optimistic Approach

• Draw the payoff table and add a column at the end.

• Enter the row-best value in the cells of this column.

• Select the best value in this column.

<Illustrate on the board>
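The three steps above can be sketched in Python for a maximisation problem, using the hypothetical payoff table from earlier:

```python
# Hypothetical payoff table for a maximisation problem (numbers illustrative).
payoff_table = {
    "Small plant": {"Weak demand": 150, "Strong demand": 200},
    "Large plant": {"Weak demand": -50, "Strong demand": 500},
}

# Extra column: the row-best (maximum) payoff for each alternative.
row_best = {alt: max(row.values()) for alt, row in payoff_table.items()}

# Optimistic (maximax) decision: the best of the row-best values.
decision = max(row_best, key=row_best.get)
print(decision, row_best[decision])   # Large plant 500
```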

Conservative Approach

• Draw the payoff table and add a column at the end.

• Enter the row-worst value in the cells of this column.

• Select the best value in this column.

<Illustrate on the board>
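The conservative (maximin) steps differ only in taking the row-worst value. A sketch on the same hypothetical table:

```python
# Same hypothetical payoff table, maximisation problem.
payoff_table = {
    "Small plant": {"Weak demand": 150, "Strong demand": 200},
    "Large plant": {"Weak demand": -50, "Strong demand": 500},
}

# Extra column: the row-worst (minimum) payoff for each alternative.
row_worst = {alt: min(row.values()) for alt, row in payoff_table.items()}

# Conservative (maximin) decision: the best of the row-worst values.
decision = max(row_worst, key=row_worst.get)
print(decision, row_worst[decision])   # Small plant 150
```

Note that the conservative criterion picks the small plant while the optimistic criterion picked the large one.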

Minimax Regret

• Using the payoff table, prepare a regret table by replacing each entry with its regret. The regret is the difference between the best payoff in that column and the actual payoff.

• Add a column at the end and enter in it the row maxima.

• Select the minimum value in the last column.

<Demonstrate minimax regret on the board>
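A Python sketch of the minimax regret steps on the same hypothetical table (maximisation case):

```python
# Minimax regret on the hypothetical payoff table (numbers illustrative).
payoff_table = {
    "Small plant": {"Weak demand": 150, "Strong demand": 200},
    "Large plant": {"Weak demand": -50, "Strong demand": 500},
}
states = ["Weak demand", "Strong demand"]

# Column best: the best payoff attainable under each state of nature.
col_best = {s: max(row[s] for row in payoff_table.values()) for s in states}

# Regret table: best-in-column minus the actual payoff.
regret = {alt: {s: col_best[s] - row[s] for s in states}
          for alt, row in payoff_table.items()}

# Row maxima of the regret table, then the minimum of those maxima.
max_regret = {alt: max(r.values()) for alt, r in regret.items()}
decision = min(max_regret, key=max_regret.get)
print(decision, max_regret[decision])   # Large plant 200
```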

Decision Making with Probabilities

Expected Value of a Chance Node

• Consider a chance node that has a number of “states of nature” branches leaving it.

• Each branch has a payoff and an associated probability of occurrence

• For each “state of nature” branch, find the product of the payoff and the probability. Add these products over all the branches leaving the chance node. This is the EV of this chance node.
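A one-line sketch of the chance-node rule in Python, with made-up branch probabilities and payoffs:

```python
# Branches leaving one chance node: (probability, payoff) pairs.
# The probabilities must sum to 1; these numbers are illustrative.
branches = [(0.4, -50), (0.6, 500)]

# EV of the chance node: sum of payoff × probability over its branches.
ev = sum(prob * payoff for prob, payoff in branches)
print(ev)   # 0.4 * (-50) + 0.6 * 500 = 280.0
```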

Decision Making with Probabilities

Expected Value of a Decision Node

• Consider a decision node that has a number of “decision alternative” branches leaving it.

• Each branch has a payoff, which is the expected value of the node where the branch terminates.

• Determine the best of the expected values for all the branches leaving the decision node. This is the EV of this decision node.
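The decision-node rule is simply a best-of over the branch EVs. A sketch, with the branch expected values assumed:

```python
# Expected values of the branches leaving one decision node
# (each is the EV of the node where that branch terminates; values assumed).
branch_evs = {"Small plant": 180.0, "Large plant": 280.0}

# EV of the decision node: the best branch EV (max for maximisation).
best_alt = max(branch_evs, key=branch_evs.get)
ev = branch_evs[best_alt]
print(best_alt, ev)   # Large plant 280.0
```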

Expected Value of Decision Tree

• The two rules will be used sequentially to determine the expected value of a decision tree. This technique is called the roll-back technique, because we start at the last node and work backwards. The two rules are repeated below.

• Rules for decision nodes and chance nodes
– Chance node: The expected payoff is the weighted average payoff, the weights being the probabilities.
– Decision node: The expected payoff is the value of the branch with the best expected value.

Roll-back technique

Under this technique, we start from the last node in the sequence and work backwards to the first node.

See the illustration and also the Excel format.
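The roll-back technique can be sketched as a short recursion in Python. The tree representation and all numbers below are invented for illustration: a decision node is a dict mapping alternatives to subtrees, a chance node is a list of (probability, subtree) pairs, and a leaf is a plain payoff.

```python
# Roll-back: evaluate the tree from the leaves back to the root.
def rollback(node):
    if isinstance(node, dict):    # decision node: take the best branch EV
        return max(rollback(child) for child in node.values())
    if isinstance(node, list):    # chance node: probability-weighted average
        return sum(prob * rollback(child) for prob, child in node)
    return node                   # leaf: the payoff itself

tree = {
    "Small plant": [(0.4, 150), (0.6, 200)],   # EV = 180
    "Large plant": [(0.4, -50), (0.6, 500)],   # EV = 280
}
print(rollback(tree))   # 280.0
```

The recursion applies the chance-node rule and the decision-node rule at each level, which is exactly the two-rule sequence described above.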

Types of Expected Value

• Expected Value of a decision tree
– Discussed in the previous slides
– Here the decision maker uses “prior probabilities”, based on a preliminary assessment of the states of nature.

• Expected value with information
– Here the decision maker undertakes a study, based on which the prior probabilities are updated to more accurate probabilities called “posterior probabilities”.

Expected Value with Information

• The information is usually based on a sample study; the corresponding expected value is called the Expected Value with Sample Information (EV-w-SI).

• Assume for a moment that the information is perfectly accurate; the corresponding expected value is called the Expected Value with Perfect Information (EV-w-PI).

Note: In the case of perfect information, the posterior probability of the state that will actually occur becomes unity.

Expected Value of Information

• The difference between the expected values with and without information is called the expected value of the information, EVSI or EVPI as the case may be.

• The efficiency of sample information is defined as (EVSI / EVPI) * 100%.
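These definitions can be sketched in Python on the hypothetical payoff table from earlier. The prior probabilities and the EV-w-SI figure are assumed purely for illustration:

```python
# EVPI and efficiency of sample information (all numbers illustrative),
# with assumed prior probabilities P(Weak) = 0.4, P(Strong) = 0.6.
priors = {"Weak demand": 0.4, "Strong demand": 0.6}
payoff_table = {
    "Small plant": {"Weak demand": 150, "Strong demand": 200},
    "Large plant": {"Weak demand": -50, "Strong demand": 500},
}

# EV without information: best expected payoff over the alternatives.
ev_without = max(sum(priors[s] * row[s] for s in priors)
                 for row in payoff_table.values())          # 280.0

# EV with perfect information: pick the best alternative in each state.
ev_w_pi = sum(priors[s] * max(row[s] for row in payoff_table.values())
              for s in priors)                              # 360.0

evpi = ev_w_pi - ev_without                                 # 80.0

# Suppose a sample study gave EV-w-SI = 320 (an assumed figure); then:
ev_w_si = 320
evsi = ev_w_si - ev_without                                 # 40.0
efficiency = evsi / evpi * 100                              # 50.0 (%)
print(evpi, evsi, efficiency)
```

An efficiency of 50% would say the sample study captures half of the value that perfect information would provide.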