Gage a a A


    8/3/2019 Gage a a A

    Attribute Agreement Analysis

    Overview | How to | Data | Example

    Overview

    The Attribute Agreement Analysis is used to assess the accuracy of subjective ratings by people. In general, it is more likely that subjective ratings are accurate and useful if there is substantial agreement in measurements among appraisers.
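    To illustrate the idea of agreement among appraisers (this is not ProcessMA code, just a minimal stdlib Python sketch with hypothetical names), overall agreement can be summarized as the share of samples on which every appraiser gives the same rating:

```python
def percent_all_agree(ratings_by_appraiser):
    """Percent of samples on which every appraiser gave the same rating.

    ratings_by_appraiser: dict mapping appraiser name -> list of ratings,
    one rating per sample, in the same sample order for every appraiser.
    """
    per_sample = list(zip(*ratings_by_appraiser.values()))
    agreed = sum(len(set(sample)) == 1 for sample in per_sample)
    return 100.0 * agreed / len(per_sample)

# Two appraisers rating three samples: they agree on the first two only.
print(percent_all_agree({"A": [1, 0, 1], "B": [1, 0, 0]}))
```

    A low value here is the signal the analysis looks for: it suggests the rating procedure, not just one appraiser, may need attention.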

    How to

    1. Choose ProcessMA > Quality Tools > Attribute Agreement Analysis.

    2. In Rating, select the column containing the measurement data.

    3. In Samples, select the column containing the sample indicators.

    4. In Appraisers, select the column containing the appraiser indicators.

    5. Click OK.

    Optional

    6. In Known standard, select the column containing the known standard or master value for each sample.

    7. Check Attribute data is ordered, if your measurement data have more than two levels and are ordinal.

    8. Check Show Kappa and Kendall coef, if you want to display the kappa coefficient tables and Kendall's coefficient tables.

    Note To select a column of data into a textbox, double-click on any of the column names shown in the list on the left of the dialog box while in the textbox.
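    For reference, the kappa statistic in step 8 measures agreement corrected for chance. A minimal sketch of unweighted Cohen's kappa between one appraiser and the known standard (an illustrative stdlib Python helper, not ProcessMA's internal implementation):

```python
from collections import Counter

def cohen_kappa(ratings, standard):
    """Unweighted Cohen's kappa between one appraiser and the standard."""
    n = len(ratings)
    # Observed agreement: fraction of samples rated identically.
    p_obs = sum(r == s for r, s in zip(ratings, standard)) / n
    # Expected chance agreement from the marginal frequency of each level.
    cr, cs = Counter(ratings), Counter(standard)
    p_exp = sum(cr[level] * cs[level] for level in cr) / n ** 2
    return (p_obs - p_exp) / (1 - p_exp)

print(cohen_kappa([0, 1, 0, 1], [0, 1, 1, 0]))  # → 0.0 (chance-level agreement)
```

    Kappa is 1 for perfect agreement and near 0 when agreement is no better than chance; for ordinal data with more than two levels, Kendall's coefficients additionally credit near-misses, which plain kappa does not.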

    Data

    Rating: Text or Numeric.

    Samples: Text or Numeric; must contain the same number of data points as Rating.

    Appraisers: Text or Numeric; must contain the same number of data points as Rating.

    Known standard: Text or Numeric.


    Example

    You work in a garment factory and have just trained 4 new quality controllers. The quality controllers need to determine if the garments are up to standard. You want to assess whether these new quality controllers are ready for the job. You asked each quality controller to rate 10 garments on a five-point scale (-2, -1, 0, 1, 2).

    1. Open worksheet Gage.xls.

    2. Choose ProcessMA > Quality Tools > Attribute Agreement Analysis.

    3. In Rating, select C Rating.

    4. In Samples, select B Garment.

    5. In Appraisers, select A Controller.

    6. In Known standard, select D Standard.

    7. Check Attribute data is ordered.

    8. Check Show Kappa and Kendall coef.

    9. Click OK.

    Attribute Agreement Analysis: Rating

    Each Appraiser VS Standard

    Assessment Agreement

    Appraiser # Inspected # Matched Percent 95% CI

    Frances 10 9 90 (55.5, 99.75)

    Jane 10 9 90 (55.5, 99.75)

    John 10 6 60 (26.24, 87.84)

    Mary 10 10 100 (74.11, 100)

    Between Appraisers

    Assessment Agreement

    # Inspected # Matched Percent 95% CI

    10 2 20 (2.521, 55.61)

    All Appraisers VS Standard

    Assessment Agreement

    # Inspected # Matched Percent 95% CI

    10 2 20 (2.521, 55.61)

    Appraiser VS Standard

    [Chart: % Matched with 95% CI plotted for each appraiser — Frances, Jane, John, Mary]

    Interpretation

    The Each Appraiser VS Standard assessment agreement table shows that John was only able to match 6 out of the 10 assessments, while Mary was able to match all of them. The confidence interval of % Matched is shown in the table and also plotted as a chart. Based on this study, you conclude that John is in most need of further training.
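    The 95% CI values in the tables above are consistent with exact (Clopper-Pearson) binomial intervals on the matched proportion; that ProcessMA uses exactly this method is an assumption, but the numbers line up. A stdlib-only Python sketch that reproduces them:

```python
from math import comb

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p ** i * (1 - p) ** (n - i) for i in range(k + 1))

def _solve(f, lo=0.0, hi=1.0):
    """Bisect an increasing function f to locate its root in [lo, hi]."""
    for _ in range(100):
        mid = (lo + hi) / 2
        if f(mid) < 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def clopper_pearson(matched, inspected, alpha=0.05):
    """Exact two-sided CI for the % Matched proportion, in percent."""
    x, n = matched, inspected
    # Lower limit: p at which seeing x or more matches has probability alpha/2.
    lower = 0.0 if x == 0 else _solve(lambda p: (1 - binom_cdf(x - 1, n, p)) - alpha / 2)
    # Upper limit: p at which seeing x or fewer matches has probability alpha/2.
    upper = 1.0 if x == n else _solve(lambda p: alpha / 2 - binom_cdf(x, n, p))
    return 100 * lower, 100 * upper

print(clopper_pearson(6, 10))  # John: 6 of 10 matched -> about (26.24, 87.84)
print(clopper_pearson(9, 10))  # Frances, Jane: 9 of 10 -> about (55.5, 99.75)
```

    Note how wide these intervals are at n = 10: John's interval reaches 87.84%, so a larger study would be needed before drawing firm conclusions about any single appraiser.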