
DRDC-RDDC-2015-P167

Sensitivity Analysis of Bid Evaluation Plans for Defence Acquisition Problems

Ahmed Ghanmi Defence Research and Development Canada--Centre for Operational Research and Analysis

Ottawa, Ontario, Canada, K1A 0K2, [email protected]

Abstract: This paper presents a simulation-based method for performing high dimensional sensitivity analysis in multi-criteria decision making problems. The method integrates multi-criteria decision analysis, Monte Carlo simulation, and optimization techniques to assess the sensitivity of criteria weights and performance values of alternatives. It determines a ranking probability matrix by simulating stochastic input parameters and calculates the most probable rankings of alternatives using an assignment algorithm. The method was developed to assess the robustness of bid evaluation plans for defence acquisition problems. It would allow decision makers to better understand the impact of uncertainty in criteria weights on bid performance and to further refine bid evaluation plans. An illustrative example using a military aircraft acquisition scenario is presented and discussed.

Keywords: simulation; sensitivity analysis; criteria weights; bid evaluation.

I. INTRODUCTION

In decision theory, sensitivity analysis is a fundamental concept for assessing the stability of an optimal alternative under changes in input parameters, the impact of the lack of controllability of certain input parameters, and the need for precise estimation of input parameter values [1]. This paper examines sensitivity analysis of bid evaluation plans for defence acquisition problems.

The acquisition process of defence systems requires the development of bid evaluation plans for assessing the performance of alternatives (bidders). A bid evaluation plan is a technical framework that details the evaluation approach of bid proposals, including the evaluation criteria, their associated weight factors, and the selection method that will be used to differentiate between bids. Decision makers sometimes use Multi-Criteria Decision Analysis (MCDA) methods for developing bid evaluation plans. The general form of a MCDA problem is the evaluation of a number of alternatives against multiple decision criteria. The most commonly used MCDA approach for bid evaluation plans of defence acquisition is the additive weighted scoring method, where relative weights are assigned to criteria and bidders are rated per criterion. The criteria weights and rating functions are fixed and publicly announced in the request for proposal. After bid submission, the bidders are scored by taking the sum of the products (weights times ratings) over all criteria. The highest scoring bidder wins the contract.

Although the additive weighted scoring method is mathematically simple and easy to implement, it is known to suffer from potential consistency and validity problems due to subjective weight selection [2]. Ideally, criteria weights are arithmetically or objectively measured; however, in most cases this is not possible. Lacking such measurements, the determination of appropriate weights that accurately reflect the decision maker's priorities is known to be a cognitively demanding task; it is difficult to objectively quantify the relative importance of criteria.

Moreover, criteria weights for public contracts should be defined and published before the bids are known. This introduces another complexity in using MCDA for bid evaluation problems, since the elicitation of weights without reference to their domains of variation (and their impacts on the evaluation) is theoretically incorrect and has no mathematical meaning in the framework of an additive aggregation model [3]. As such, sensitivity analysis of MCDA input parameters and assumptions should be conducted to explore the whole space of criteria weights and to assess the robustness of an evaluation plan. Sensitivity analysis would also allow decision makers to identify extreme bids by analyzing different combinations of criteria weights. An extreme bid is defined as a bid that performs extremely well on some criteria, but also extremely poorly on others.

Numerous sensitivity analysis methods have been proposed in the literature for MCDA applications. In general, these methods can be classified into deterministic and stochastic approaches. The deterministic sensitivity analysis approach predominantly uses distance measures in high-dimensional space to quantify the minimum modification of the input parameters that is required to alter the total values of selected alternatives. Evans [1] investigated linear programming-like sensitivity analysis in decision theory. His approach was based on geometric characteristics of optimal decision regions in the probability space, and he analyzed the sensitivity of the optimal decision to changes in the probabilities of the states of nature. Barron and Schmidt [4] recommended two procedures to accomplish sensitivity analysis in multi-attribute value models: an entropy-based procedure and a least squares procedure. The entropy-based procedure assumes nearly equal weights, whereas the least squares procedure requires a set of arbitrary weights for the attributes. For a given pair of alternatives, one of which is the best alternative, these procedures calculate the closest set of weights that equates their rankings. Mareschal [5] proposed a sensitivity analysis method whereby stability intervals for the weights of different criteria are defined. These consist of the values that the weight of one criterion can take without altering the results given by the initial set of weights, all other weights being kept constant. Rios Insua [6] developed a methodology for sensitivity analysis in multi-objective decision making. He introduced a general framework for sensitivity analysis that expanded results of the traditional Bayesian approach to decision making, with emphasis on cases that use partial data, doubtful data, or both. Rios Insua and French [7] developed a conceptual framework for sensitivity analysis in MCDA which allows simultaneous variation of criteria weights. Their method identifies the smallest changes necessary in the input parameters before a significant change in the ranking of the alternatives occurs. This is achieved by using the Euclidean or the Chebyshev distance metric. Triantaphyllou and Sanchez [8] developed a method that involves performing sensitivity analysis on the weights of the decision criteria and the scores of alternatives expressed in terms of decision criteria. They focused their effort on determining the most critical criterion, defined as the criterion whose smallest change in weight resulted in a different ordering of alternatives. Chen and Kocaoglu [9] developed a sensitivity analysis algorithm to study the robustness of hierarchical decision models to changes in every local contribution matrix at different levels. The algorithm is independent of the pairwise comparison scales and judgment quantification techniques and is applicable to all hierarchical decision models based on an additive relationship. More recently, Kaluzny and Shaw [10] used high-dimensional computational geometry, namely polyhedral theory, to define three measures of sensitivity analysis (volume, distance, and representativity). Using polyhedra, they suggested a linear programming approach to classify the weight space exactly. They applied the methodology to an options analysis problem of the Canadian Surface Combatant acquisition project and demonstrated that the overall ranking of options is stable against moderate variations in the weights.

An alternative approach to deterministic sensitivity analysis in an MCDA context is through simulation. A commonly used strategy for implementing this type of sensitivity analysis is the Monte Carlo simulation approach, which generally requires the decision makers to specify variation ranges for the criteria weights and scores. Significantly fewer stochastic sensitivity analysis methods have been proposed in the literature compared to deterministic methods. Janssen [11] introduced a procedure to systematically analyze the sensitivity of the ranking of alternatives to overall uncertainty in the MCDA input parameters using Monte Carlo simulation. The uncertainty in the input parameters was described by the maximum percentage of variation for criteria weights and ratings and was represented by probability distributions. However, Janssen's approach does not consider uncertainties in all input parameters simultaneously and uses only one type of probability distribution to represent parameter uncertainties. Butler et al. [12] proposed a sensitivity analysis method that utilises Monte Carlo simulation to vary all of the criteria weights of an MCDA model simultaneously. The method randomly samples from the entire weight space to implicitly classify the weights by the rank orderings they produce. In addition, the method investigates the impact of varying the functional form of the multi-attribute aggregation. Hyde et al. [13] proposed a stochastic sensitivity analysis method that involves defining the uncertainty in the input parameters using probability distributions, performing Monte Carlo simulation, and undertaking a significance analysis using the Spearman rank correlation coefficient. The outputs of this method include a distribution of the total values of each alternative based upon the expected range of input parameters.

Ghanmi [14] proposed a simulation-based sensitivity analysis method for MCDA problems and applied the method to the analysis of the bid evaluation plan for the acquisition of a military search and rescue aircraft. The method integrates MCDA, Monte Carlo simulation, and optimization techniques to assess the robustness of bid evaluation plans. This paper discusses the proposed method with an illustrative example.

The paper is divided into five sections and structured as follows. Section 2 describes bid evaluation plans for defence acquisition problems and discusses criteria weighting and scoring methods. Section 3 presents the proposed simulation-based sensitivity analysis method and algorithm, and Section 4 provides an illustrative example of bid evaluation plan sensitivity analysis. Concluding remarks and future work are indicated in the fifth section.

II. BID EVALUATION PLAN

From a defence acquisition perspective, a bid evaluation plan is a framework that describes the evaluation approach for the procurement of a defence system, including details about the evaluation criteria, criteria weights, scoring methods, bid selection process, etc. Developing a bid evaluation plan for defence acquisition is an iterative process that involves multidisciplinary decision makers, including financial, operational, technical, legal, and strategic analysis teams. The establishment of decision criteria is an important step in the bid evaluation process. Criteria and their assigned weights vary by the type of system (e.g., aircraft, ship, ground vehicle) that is being procured. They could also be different for in-service (e.g., upgrade of a current system) and on-paper (new procurement) systems.

To facilitate the bid evaluation process, decision makers sometimes use MCDA methods in the development of bid evaluation plans. One of the most frequently used MCDA methods is the additive utility model, which requires that the criteria are independent and have cardinal weights. Let n be the number of evaluation criteria, w_k the relative weight of criterion k (k = 1, 2, ..., n), and u_k the performance value of a bid for criterion k. The overall performance value of the bid (V), taking all criteria into consideration simultaneously, is given by the following additive expression:

V = \frac{\sum_{k=1}^{n} w_k u_k}{\sum_{k=1}^{n} w_k} \quad (1)
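As a minimal illustration of equation (1), the following Python sketch computes the overall value of a bid; the weights and ratings are invented for illustration only:

    import numpy as np

    def additive_score(weights, ratings):
        # Overall bid value V = sum(w_k * u_k) / sum(w_k), per equation (1).
        w = np.asarray(weights, dtype=float)
        u = np.asarray(ratings, dtype=float)
        return float(np.dot(w, u) / w.sum())

    # Hypothetical example: three criteria weighted 0.5/0.3/0.2, ratings out of 100.
    print(additive_score([0.5, 0.3, 0.2], [80, 65, 90]))  # 77.5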

A. Evaluation Criteria

The bid evaluation criteria for the acquisition of defence systems are usually grouped into two main categories: technical and financial criteria.

• Technical criteria. Technical criteria are used to assess the performance of a defence system against the technical requirements defined in the request for proposals. They are usually divided into mandatory and rated criteria. Mandatory criteria (e.g., payload, range, speed) identify the minimum requirements that are essential to the successful development of the system. Rated criteria (e.g., reliability, engineering support) are used to determine the relative technical merit of each bid.

• Financial criteria. Criteria for the financial evaluation of a defence system would include: acquisition cost, risk delivery cost, operations and maintenance costs, as well as other additional work request costs.

B. Criteria Weighting Methods

The weight assigned to a criterion is basically a scaling factor which relates scores for that criterion to scores for all other criteria. Different weight elicitation techniques have been presented in the open literature. They can be classified into two main categories: rating and rank-order procedures. The rating procedures maintain ratio scale properties of the decision maker's judgments from extraction and use exact values for representation and interpretation. Common to all these procedures is that the actual attribute weights used for the representation are derived by normalizing the sum of given points to one. Accurate determinations of criteria weights using the rating procedures are often hard to obtain in practice since assessed weights are subject to response error [3]. Consequently, other procedures based on the determination of a relative order of importance of the criteria (i.e., rank-order procedures) have been proposed.

The rank-order procedures determine ordinal values of criteria that are converted to surrogate (cardinal) weights consistent with the criteria rankings. Several proposals for converting rankings to numerical weights have been made in the literature. The most prominent schemes are: rank sum weights, rank reciprocal weights, and rank order centroid weights. The rank sum (or rank linear) weight elicitation method uses a linear progression to derive weights consistent with the criteria ranks. The rank reciprocal (or inverse) weight elicitation method derives weights proportional to the inverse of the criteria ranks. The rank order centroid method produces an estimate of weights that minimizes the maximum error of each weight by identifying the centroid of all possible weights while maintaining the rank order of objective importance. Other weight allocation methods using nonlinear relationships between criteria weights and rankings can also be found in the literature (see, for example, [15]).
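For n criteria ranked with 1 as most important, the three schemes have standard closed forms, sketched below in Python (these are the usual textbook formulas, not code taken from the paper):

    import numpy as np

    def rank_sum(n):
        # Rank sum: weights decrease linearly with rank, w_k proportional to n - k + 1.
        w = np.arange(n, 0, -1, dtype=float)
        return w / w.sum()

    def rank_reciprocal(n):
        # Rank reciprocal: weights proportional to 1/rank.
        w = 1.0 / np.arange(1, n + 1)
        return w / w.sum()

    def rank_centroid(n):
        # Rank order centroid: w_k = (1/n) * sum_{j=k..n} 1/j.
        inv = 1.0 / np.arange(1, n + 1)
        return np.cumsum(inv[::-1])[::-1] / n

    for scheme in (rank_sum, rank_reciprocal, rank_centroid):
        print(scheme.__name__, np.round(scheme(5), 3))

Running this for five criteria reproduces the pattern discussed next with figure 1.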

A comparison of weights for five criteria using the three conversion methods (rank sum, rank reciprocal, rank centroid) is depicted in figure 1. The rank reciprocal and rank centroid methods assign more weight to the first-ranked criterion than the rank sum method, and less weight to the second, third, and fourth criteria. In general, there is no rule for using a particular conversion method for a given decision problem. However, a sensitivity analysis should be performed to assess the impact of each method on the performance values and the overall rankings of bids.

Fig. 1. Comparison of weights using different elicitation methods (legend: rank sum, rank reciprocal, rank centroid; x-axis: rank order, 1-5; y-axis: weight).

C. Bid Scoring Methods


Different analytical methods have been proposed in the literature for scoring bid performance with respect to the technical and financial criteria and sub-criteria. For the rated technical criteria, scoring methods range from qualitative judgments of an expert to sophisticated mathematical models. The most commonly used scoring technique for bid evaluation of public procurements is the point allocation method. This method requires the decision maker to assign a hypothetical number of points (e.g., 0-100) to bids so as to reflect their performances with respect to the different criteria. The score allocation is based strictly upon a decision maker's subjective judgments.

There is no established methodology for assessing the financial performance of bid proposals. In general, a bid that exceeds a threshold cost (i.e., budget limit) will not be considered in the evaluation process. In this paper, a cost-based scoring approach is proposed to determine the acquisition cost scores of defence systems. The approach establishes mathematical relationships (e.g., linear, exponential, power series) between acquisition scores and costs of bids, taking into consideration the budget constraint for the project. Detailed descriptions of the cost scoring methods can be found in [14].
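As an illustration, two possible cost-to-score mappings are sketched below. These are hypothetical functional forms normalized to a budget limit; the exact models investigated in [14] are not reproduced here, and the 'shape' parameter is an invented tuning knob:

    import math

    def linear_cost_score(cost, budget):
        # Linear model: full score at zero cost, zero score at the budget limit.
        return max(0.0, 100.0 * (1.0 - cost / budget))

    def exponential_cost_score(cost, budget, shape=3.0):
        # Exponential model: score decays with cost, rescaled so that
        # score(0) = 100 and score(budget) = 0; 'shape' controls steepness.
        return 100.0 * (math.exp(-shape * cost / budget) - math.exp(-shape)) / (1.0 - math.exp(-shape))

    print(linear_cost_score(60e6, 100e6))                 # 40.0
    print(round(exponential_cost_score(60e6, 100e6), 1))  # steeper penalty than linear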

III. SENSITIVITY ANALYSIS

The bid evaluation problem is a potentially subjective exercise that induces uncertainties in the assessment of criteria weights and scores. The analytical methods discussed above could be used to assist decision makers in the elicitation of initial weights and initial scores of the criteria. Further examination of the solution space for the criteria weights and scores would be required to determine a robust bid evaluation plan. This could be done by performing a sensitivity analysis of the input parameters.

Conducting a sensitivity analysis of the criteria weights (and the performance values) is often insightful, but traditional sensitivity analysis models typically vary a single weight and observe the effect on the results of the model. A method of simultaneously varying all, or at least a large subset of, the weights would be useful. This section presents a simulation-based approach for performing high dimensional sensitivity analysis in MCDA to assess the robustness of bid evaluation plans.

A. Criteria Weights Simulation

Stochastic criteria weights can be generated at random using Monte Carlo simulation so that the results of many combinations of weights can be explored in an efficient manner. Three main classes of simulation procedures for generating stochastic weights are discussed in the literature [12] and are illustrated in the sketch following this list: random weights, random weights preserving a rank-order of importance, and random weights from a hypothetical response distribution.

• Random weights. Random weights can be generated using computer simulation programs to explore the entire domain of possible weight combinations. This approach uses no prior information as to a decision maker's preference. To generate random weights for the additive MCDA model, random numbers are selected independently from a uniform distribution on (0, 1), normalized, and ranked.

• Rank order weights. In contrast with random weights, rank order weights are randomly generated while preserving their criteria rank order. This procedure restricts the domain of possible weights to those that are consistent with the decision maker's judgement of criteria importance.

• Response distribution weights. A criterion weight can be represented by a probability distribution function (response distribution). The response distribution procedure considers weight variations in the form of response error to account for uncertainty in weight assessment. To use simulation for exploring the implications of weight variations, the assessed weights are treated as means of probability distributions of responses, and simulated weights are then generated (and normalized) from these distributions.
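A sketch of the three procedures, assuming a uniform ±10% response error for the third one (the error range and the random seed are illustrative choices, not values from the paper):

    import numpy as np

    rng = np.random.default_rng(seed=1)  # fixed seed for reproducibility

    def random_weights(n):
        # Uniform random weights on (0, 1), normalized to sum to one.
        w = rng.random(n)
        return w / w.sum()

    def rank_order_weights(n):
        # Random weights consistent with a fixed importance order
        # (criterion 1 most important): sort a random draw in descending order.
        w = np.sort(rng.random(n))[::-1]
        return w / w.sum()

    def response_distribution_weights(assessed, rel_error=0.10):
        # Treat assessed weights as means of a response distribution:
        # perturb each uniformly within +/- rel_error, then renormalize.
        a = np.asarray(assessed, dtype=float)
        w = a * rng.uniform(1.0 - rel_error, 1.0 + rel_error, size=a.size)
        return w / w.sum()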

B. Performance Values Simulation

A high dimensional sensitivity analysis could be performed to assess the impact of criteria weights, performance values of alternatives, or both. As with the criteria weights simulation, stochastic performance values (i.e., technical scores) could be generated using Monte Carlo simulation. For the financial criteria, initial scores for the acquisition cost (for example) could be determined using a cost-based approach. These scores could be used as means of probability distributions of responses, and stochastic financial scores could be generated from these distributions using Monte Carlo simulation. For the technical criteria, stochastic scores could also be generated using a response distribution procedure, with initial scores obtained with a point allocation method (for example) used as means of the probability distributions.

C. Simulation-Based Sensitivity Analysis

A simulation-based method was developed for performing high dimensional sensitivity analysis of bid evaluation plans. The method involves three main steps:

1. The first step of the simulation-based sensitivity analysis consists of determining the initial input parameters (criteria weights and performance values of alternatives). Initial rankings of alternatives are determined using MCDA methods (e.g., the weighted-sum method).

2. The second step consists of generating stochastic parameters using Monte Carlo simulation by perturbing the initial parameters randomly. The process is repeated in order to determine the probability with which each alternative will achieve each rank. The rank probabilities from Monte Carlo simulation can be organized into a ranking probability matrix in which each entry r_{ij} is the probability that alternative i ranks at position j. These probabilities are used to assess the robustness of the alternative rankings. The values of r_{ij} are calculated by dividing the number of times alternative i ranked at position j by the total number of simulation runs. The ranking probability matrix has the following properties:

\sum_{i=1}^{m} r_{ij} = 1, \quad 1 \le j \le m \quad (2)

\sum_{j=1}^{m} r_{ij} = 1, \quad 1 \le i \le m \quad (3)

Since the scores in each run are random, the portion of runs that contain tied alternatives is negligible. For the purpose of generating the rank probabilities, therefore, such ties are broken in an arbitrary manner.

3. The third step uses the ranking probabilities developed in step 2 to determine the most likely alternative rankings. The condition that defines a particular ranking as most probable is described in the context of an assignment problem. The most probable rankings of alternatives and their ranking probabilities are then found by solving the assignment problem using the ranking probability matrix. The assignment problem determines an assignment of alternatives to the set of rank positions without assigning a rank more than once and ensuring that all alternatives are ranked.

The assignment problem can be mathematically formulated as follows. Let x_{ij} be the assignment variable of alternative i to rank position j (x_{ij} = 1 if alternative i is ranked at position j, 0 otherwise). The objective function of the ranking assignment problem is to maximize the sum of the ranking probabilities of alternatives.

\max Z = \sum_{i=1}^{m} \sum_{j=1}^{m} x_{ij} r_{ij} \quad (4)

Subject to:

\sum_{j=1}^{m} x_{ij} = 1, \quad 1 \le i \le m \quad (a)

\sum_{i=1}^{m} x_{ij} = 1, \quad 1 \le j \le m \quad (b)

x_{ij} \in \{0, 1\} \quad (c)

The set of constraints given in (a) requires that every alternative is assigned to exactly one rank position, while the set of constraints specified in (b) requires that every rank position is assigned exactly one alternative. Constraints on the domain of the variables are specified in (c).

In the course of finding the set of binary x_{ij} values that maximize Z, the constraints ensure that for any value of i, x_{ij} = 1 for only one value of j. As such, the rank order assigned to alternative i (R_i) and its component-level assignment probability (P_i) can be extracted from the solution values for x_{ij} as follows:

R_i = \sum_{j=1}^{m} j \, x_{ij}, \quad 1 \le i \le m \quad (5)

P_i = \sum_{j=1}^{m} x_{ij} r_{ij}, \quad 1 \le i \le m \quad (6)
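A runnable sketch of steps 2 and 3 is given below in Python. The Monte Carlo loop implements a uniform ±10% response-distribution perturbation (the range is a modelling choice, matching the one used later in Section 4), and the assignment problem (4) is solved with SciPy's Hungarian-algorithm routine rather than a hand-written solver. The demonstration matrix is the ranking probability matrix of Table 2 in Section 4:

    import numpy as np
    from scipy.optimize import linear_sum_assignment

    rng = np.random.default_rng(seed=1)

    def ranking_probability_matrix(weights, scores, n_runs=10_000, rel_error=0.10):
        # Step 2: estimate r_ij, the probability that alternative i ranks at
        # position j, by perturbing weights and scores within +/- rel_error.
        w0 = np.asarray(weights, dtype=float)
        s0 = np.asarray(scores, dtype=float)   # rows: alternatives, cols: criteria
        m = s0.shape[0]
        counts = np.zeros((m, m))
        for _ in range(n_runs):
            w = w0 * rng.uniform(1 - rel_error, 1 + rel_error, w0.size)
            s = np.clip(s0 * rng.uniform(1 - rel_error, 1 + rel_error, s0.shape), 0, 100)
            totals = s @ w / w.sum()           # additive model, equation (1)
            order = np.argsort(-totals)        # descending; ties broken arbitrarily
            counts[order, np.arange(m)] += 1   # alternative order[j] took position j
        return counts / n_runs                 # rows and columns sum to 1, eqs. (2)-(3)

    def most_probable_ranking(r):
        # Step 3: solve assignment problem (4), maximizing the total probability
        # of a one-to-one assignment of alternatives to rank positions.
        r = np.asarray(r, dtype=float)
        rows, cols = linear_sum_assignment(r, maximize=True)
        return cols + 1, r[rows, cols]         # R_i per eq. (5), P_i per eq. (6)

    # Demonstration on the ranking probability matrix of Table 2 (percent / 100).
    r = np.array([[10, 63, 22,  5,  0],
                  [88, 10,  1,  1,  0],
                  [ 2, 20, 45, 27,  6],
                  [ 0,  7, 28, 49, 16],
                  [ 0,  0,  4, 18, 78]]) / 100.0
    R, P = most_probable_ranking(r)
    print(R)  # [2 1 3 4 5] -> most probable order B, A, C, D, E
    print(P)  # [0.63 0.88 0.45 0.49 0.78]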

D. Simulation-Based Sensitivity Analysis Algorithm

The algorithm describing the Simulation-Based Sensitivity Analysis (SBSA) method involves defining the uncertainty in the input values, performing a stochastic sensitivity analysis, calculating ranking probabilities, and undertaking a significance analysis:

• Uncertainty in the Input Values. The uncertainty in the quantitative input values (e.g., criteria weights and performance values of alternatives) can be represented by continuous probability distributions. Different probability distributions (uniform, normal, triangular, etc.) can be used to generate stochastic input parameters, depending on the availability of data.

• Stochastic Sensitivity Analysis. Once the input values are represented stochastically, Monte Carlo simulation is used to enable repeated applications of the MCDA method with a range of possible input values. At each simulation iteration, the overall scores and rankings of alternatives are determined using MCDA. A ranking probability matrix of the alternatives is then determined based upon the expected range of possible input values for each criterion weight and score. The probability matrix is calculated by dividing the number of times an alternative is ranked at a given position by the total number of iterations in the simulation.

• Ranking Probabilities. Using the ranking probability matrix, the most probable ranking of each alternative is determined by solving the assignment problem.

• Significance Analysis. The ranking probability matrix and the assignment probabilities of alternatives could be used to analyze the sensitivity of the final rankings of alternatives to changes in the criteria weights and to assess the robustness of an evaluation plan.
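Tying the four stages together, a minimal end-to-end driver is sketched below, reusing the ranking_probability_matrix and most_probable_ranking functions above; the two-criterion weights and three-bid scores are invented for illustration:

    # Hypothetical inputs: two criteria weighted 70/30, three bids scored 0-100.
    weights = [0.7, 0.3]
    scores = [[80, 60],
              [70, 90],
              [65, 75]]
    r = ranking_probability_matrix(weights, scores, n_runs=100_000)
    R, P = most_probable_ranking(r)
    for i, (pos, prob) in enumerate(zip(R, P)):
        print(f"bid {chr(65 + i)}: most probable position {pos} ({prob:.0%})")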

IV. ILLUSTRATIVE EXAMPLE

In this section, the SBSA method is used to assess the robustness of the bid evaluation plan for the acquisition of a fleet of military aircraft for the Canadian Armed Forces (CAF). To avoid issues with classified information, generic evaluation criteria and hypothetical criteria weights were used in the evaluation plan for illustration purposes. A set of hypothetical bids was also used in the analysis.

A. Scenario and Data

Figure 2 depicts the bid evaluation criteria and their relative weights (in percentage). Four technical criteria (T1, T2, T3, and T4) and two financial criteria (F1, F2) were considered for the bid evaluation plan. The technical criteria are used to measure capability performance indicators such as the aircraft weight, range, payload, etc. The financial criteria are used to assess the different costs of the aircraft such as the acquisition cost, in-service support cost, additional work cost, etc.

Five hypothetical bids (denoted by letters A through E) were used in the analysis. In practice, technical scores of bids are determined by subject matter experts (SMEs) using the system requirements and different capability assumptions. Financial scores are also determined by SMEs using various cost-based scoring models (e.g., linear, exponential, etc.) as discussed in Section 2. However, this requires an estimation of the different aircraft costs that could be obtained using cost estimation models and historical financial data. For the purpose of this paper, hypothetical technical and financial scores were generated at random (using point allocations between 0 and 100) for the different bids (Table 1).

Fig. 2. Evaluation criteria and relative weights (hypothetical): technical criteria (70) and financial criteria (30); within the financial criteria, F1 is weighted 80 and F2 is weighted 20.

TABLE I. HYPOTHETICAL BID SCORES

       Technical Scores (%)       Financial Scores (%)
Bid    T1    T2    T3    T4       F1    F2
A      80    65    80    85       60    90
B      70    75    …     …        …     …
C      90    80    …     …        …     …
D      65    50    …     …        …     …
E      60    70    …     …        …     …

B. Bid Evaluation Sensitivity Analysis

A sensitivity analysis of the bid evaluation criteria was conducted using the SBSA method to determine how likely a given bid is to be ranked at a particular position and to assess the robustness of the bid rankings. In the analysis that follows, a response distribution method (uniform distribution with ±10% variation with respect to the baseline data) was used to generate stochastic criteria weights and scores. Random perturbations were applied to all criteria and sub-criteria weights as well as to the technical and financial criteria scores. Note that random perturbations that would bring scores (or weights) above 100 or below zero were not allowed in the simulation.

Table 2 shows the ranking probability matrix (in which probabilities are expressed as percentages) obtained using the SBSA algorithm for 100,000 simulation runs. For each bid, the ranking probabilities are highlighted in the table. The probability matrix can be interpreted as follows: Bid A (for example) ranked first for 10% of the simulation runs, second for 63%, third for 22%, fourth for 5%, and never ranked fifth. The same rationale applies for the remaining bids. Examination of the ranking probability matrix indicates that the expected order of the bids and their respective ranking probabilities (highlighted in Table 2) are: B (88), A (63), C (45), D (49), and E (78). Given that bids B and E have high ranking probabilities (i.e., greater than 75%), their positions are unlikely to be sensitive to the criteria weights and scores. In particular, bid B would likely be in the first position (88%) and bid E would likely be in the last position (78%). As bids C and D have comparable ranking probabilities (i.e., between 45 and 49), their positions will likely be more sensitive to the criteria weights and scores.


TABLE II. RANKING PROBABILITY MATRIX

         Rank Position
Bid      1     2     3     4     5
A        10    63    22    5     0
B        88    10    1     1     0
C        2     20    45    27    6
D        0     7     28    49    16
E        0     0     4     18    78

The SBSA method was also used to assess the impact of the ratio of technical to financial criteria weights on the bid performance scores. Different ratios of technical/financial criteria weights (60/40, 65/35, 70/30, 75/25, 80/20) were examined. For each technical/financial ratio, a ranking probability matrix was simulated and the most probable ranking of each bid was determined using the assignment algorithm. Table 3 presents the bid orders and their expected ranking probabilities (%) for the different technical/financial ratios. The analysis indicates that the bid rankings are sensitive to the technical/financial weighting ratio, except for bids B and A, which ranked first and second, respectively, for all ratios.

TABLE III. BID ORDERS AND RANKING PROBABILITIES FOR DIFFERENT RATIOS OF TECHNICAL TO FINANCIAL CRITERIA WEIGHTS

Rank        Ratio of Technical to Financial Criteria Weights
Position    60/40     65/35     70/30     75/25     80/20
1           B (99)    B (97)    B (88)    B (67)    B (34)
2           A (…)     A (…)     A (63)    A (…)     A (36)
3           …         …         C (45)    C (34)    …
4           …         …         D (49)    …         …
5           E (83)    E (81)    E (78)    E (63)    D (55)
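The ratio sweep can be reproduced with the Python sketches of Section 3. The score table below is purely hypothetical (Table 1 is only partially legible here), the four technical sub-weights are split equally, and the financial sub-weights follow the 80/20 split of figure 2:

    # Hypothetical 5-bid x 6-criterion score table (T1-T4, F1, F2).
    scores = np.array([[80, 65, 80, 85, 60, 90],
                       [70, 75, 72, 65, 55, 80],
                       [90, 80, 60, 70, 50, 65],
                       [65, 50, 75, 80, 70, 60],
                       [60, 70, 55, 60, 85, 75]], dtype=float)

    for tech in (0.60, 0.65, 0.70, 0.75, 0.80):
        fin = 1.0 - tech
        # Technical sub-weights split equally; financial split 80/20 per Fig. 2.
        w = [tech / 4] * 4 + [fin * 0.8, fin * 0.2]
        R, P = most_probable_ranking(ranking_probability_matrix(w, scores))
        print(f"{tech:.0%}/{fin:.0%}: positions {R}, probabilities {np.round(P, 2)}")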

The optimal technical/financial weighting ratio (for the bids under consideration) would be between 75/25 and 80/20. The optimal ratio is the one of the five examined ratios that would allow higher competition between the bids (i.e., the ratio that would allow the greatest range of variation in the output rankings, or the ratio that provides comparable ranking probabilities).

Another important factor for consideration in the evaluation plan of military systems is the ratio of the acquisition cost (i.e., capital cost) to the operations and maintenance (O&M) costs. O&M costs include consumables, engineering services, repairs, overhaul, and spare parts costs. O&M costs are important in the evaluation of the total ownership cost of military equipment as they could exceed the capital cost over the life of the equipment.

Assume that F1 is the criterion for the acquisition cost and F2 is the criterion for the O&M costs. A sensitivity analysis was conducted using the SBSA method to assess the impact of the acquisition/O&M weighting ratio on the bid rankings. The analysis was performed using a technical/financial ratio of 75/25. Table 4 presents the bid orders and ranking probabilities (%) for different F1/F2 weighting ratios.

The analysis indicates that the bid ranking positions would be sensitive to the F1/F2 ratio, except for bid C, which ranked third for all F1/F2 ratios. In particular, bid A ranked first and bid B ranked second for the 50/50 and 60/40 ratios. However, their ranking positions are reversed for the remaining F1/F2 ratios. The sensitivity of the ranking positions of bids D and E is relatively unimportant as these bids are at the bottom of the list. Unlike the technical/financial ratio analysis, it is not obvious how to identify the optimal F1/F2 ratio.

For a complete assessment of the sensitivity of bid rankings to the F1/F2 ratio, further analyses should be conducted using different technical/financial ratios. This would allow an exploration of the whole space of decision criteria. It is also important to investigate the impact of criteria weighting methods and cost scoring functions on the overall performance of bids.

TABLE IV. BID ORDERS AND RANKING PROBABILITIES FOR DIFFERENT F1/F2 CRITERIA WEIGHTS

Rank        Ratio of F1/F2 Criteria Weights
Position    50/50     60/40     70/30     80/20     90/10
1           A (63)    A (51)    B (52)    B (67)    B (79)
2           B (36)    B (37)    A (41)    A (47)    A (47)
3           C (31)    C (40)    C (47)    C (47)    C (43)
4           E (47)    E (53)    E (47)    D (51)    D (61)
5           D (89)    D (77)    D (57)    E (63)    E (83)

V. CONCLUSIONS

This paper examined some of the bid evaluation problems in military acquisition and developed a simulation-based method for performing high dimensional sensitivity analysis of bid evaluation criteria. The method integrates MCDA, Monte Carlo simulation, and optimization techniques to assess the robustness of bid evaluation plans. The approach determines a ranking probability matrix using stochastic distributions of criteria weights and scores and calculates the most probable rankings of bids and their ranking probabilities using an assignment algorithm. An example application was presented and discussed to illustrate the method.

The SBSA method was used to assess the robustness of the bid evaluation plan for the acquisition of military aircraft for the Canadian Armed Forces and provided decision makers with a better understanding of the impact of uncertainty in criteria weights on bid performance. The method was also used by decision makers to refine and optimize the bid evaluation plan by examining different combinations of criteria and sub-criteria weights.

In military acquisition problems, while scores for rated technical criteria can be determined by assessing the system performance against the statement of requirements, little information is available in the literature for assigning financial scores to bids, particularly for the acquisition cost of a system. In this paper, a cost-based scoring method was proposed to determine acquisition cost scores. Different analytical models (linear, exponential, power series, hyperbolic, and threshold) for assigning cost scores were investigated. It is recommended that the linear model (for its simplicity) or the threshold model (for its practicality in representing a budget constraint) be considered in the bid evaluation process.

Future work on bid evaluation sensitivity analysis would include the development of a decision support tool to implement the SBSA method as well as the examination of a distance-based sensitivity analysis approach.

REFERENCES

[1] J.R. Evans, "Sensitivity Analysis in Decision Theory", Decision Sciences, vol. 15(1), 1984, pp. 239-247.

[2] C.A. Bana e Costa, E.C. Correa, J.M. De Corte and J.C. Vansnick, "Facilitating Bid Evaluation in Public Call for Tenders: A Socio-Technical Approach", Omega - The International Journal of Management Science, vol. 30, 2002, pp. 227-242.

[3] M. Riabacke, M. Danielson and L. Ekenberg, "State-of-the-Art Prescriptive Criteria Weight Elicitation", Advances in Decision Sciences, 2012, pp. 1-24.

[4] H. Barron and P. Schmidt, "Sensitivity Analysis of Additive Multiattribute Value Models", Operations Research, vol. 38, 1998, pp. 122-127.

[5] B. Mareschal, "Weight Stability Intervals in Multicriteria Decision Aid", European Journal of Operational Research, vol. 33(1), 1988, pp. 54-64.

[6] D. Rios Insua, "Sensitivity Analysis in Multi-Objective Decision Making", Berlin, Springer-Verlag, 1990.

[7] D. Rios Insua and S. French, "A Framework for Sensitivity Analysis in Discrete Multi-objective Decision-making", European Journal of Operational Research, vol. 54(2), 1991, pp. 176-190.

[8] E. Triantaphyllou and A. Sanchez, "A Sensitivity Analysis Approach for Some Deterministic Multi-criteria Decision-making Methods", Decision Sciences, vol. 28(1), 1997, pp. 151-194.

[9] H. Chen and D.F. Kocaoglu, "A Sensitivity Analysis Algorithm for Hierarchical Decision Models", European Journal of Operational Research, vol. 185(1), 2008, pp. 266-288.

[10] B.L. Kaluzny and R.H.A.D. Shaw, "Sensitivity Analysis of Additive Weighted Scoring Methods", Technical Report, DRDC CORA TR 2009-002, 2009.

[11] R. Janssen, "Multiobjective Decision Support for Environmental Management", Kluwer Academic Publishers, Netherlands, 1996.

[12] J. Butler, J. Jia and J. Dyer, "Simulation Techniques for the Sensitivity Analysis of Multi-criteria Decision Models", European Journal of Operational Research, vol. 103, 1997, pp. 531-546.

[13] K.M. Hyde, H.R. Maier and C.B. Colby, "Reliability-based Approach to Multicriteria Decision Analysis for Water Resources", Journal of Water Resources Planning and Management, vol. 130(6), 2004, pp. 429-438.

[14] A. Ghanmi, "A Simulation-based Sensitivity Analysis Method for Bid Evaluation Problems", Defence Research and Development Canada - Centre for Operational Research and Analysis, DRDC-RDDC-2014-SR, 2014.

[15] D.G. Hunter and E.J. Emond, "Analytical Support to PMO JSS", Defence Research and Development Canada - Centre for Operational Research and Analysis, DRDC CORA TM 2005-32, 2005.

BIOGRAPHY

Dr Ghanmi received a B.Sc. degree in engineering and a Master's degree and PhD in applied mathematics from Laval University, Quebec, Canada. He is currently a senior defence scientist at Defence Research and Development Canada (DRDC), Centre for Operational Research and Analysis (CORA). His research interests include military operational research, materiel and logistics analysis, decision support analysis, life cycle cost modeling, risk analysis, simulation, and optimization. He has published many papers in international refereed journals and conference proceedings, and has authored several technical reports at DRDC CORA. Dr Ghanmi has led different NATO and TTCP research groups and chaired various technical teams and conference sessions.