12/30/2016
1
Less Risk, Better Talent: Using Assessments the Legal Way
Greg Barnett, Ph.D., VP of Science, The Predictive Index
January 11th, 2017
THE PREDICTIVE INDEX
- 8,000 clients
- 142 countries
- 120,000 trained
- 18,000,000 assessments
Three Topics To Cover
o Power of Assessments
o The Legal Side of Assessments
o Guidance and Considerations
Q: What is assessment?
A: A method used to provide data/information about a candidate's qualifications and potential for performing a job, or his/her ability to fit and develop a career within the organization in which the job will be performed.
Interviews, resume reviews, screening, and background checks are considered to be assessments
BELIEVE IT OR NOT!
The Funnel: A Process Model for Assessment
Level of Investment
Popular Assessments
- Personality / Behavioral Assessments
- Cognitive Assessments
- Interviews
- Skills Assessments
The Power of Assessments
- More Accurate Talent Decisions
- More Efficient Talent Decisions
- More Objective Talent Decisions
- More Informed Talent Decisions
More Accurate Talent Decisions
Interviews alone: 14%
+ Reference checks: 26%
+ Personality assessment: 38%
+ Cognitive ability: 54%
John E. Hunter and Ronda F. Hunter, "Validity and Utility of Alternative Predictors of Job Performance," Psychological Bulletin, Vol. 96, No. 1, 1984; Robert P. Tett, Douglas Jackson and Mitchell Rothstein, "Personality Measures as Predictors of Job Performance: A meta-analytical review," Personnel Psychology, Winter 1991. Michigan State University School of Business.
More Efficient Talent Decisions
More Objective
o Anchoring
o Confirmation Bias
o Illusory Correlation
o Halo Effect
o Social Comparison Bias
o Similarity-Attraction
95 Decision-Making Biases
48 Memory Biases
26 Social Biases
Over 150 Cognitive Biases
More Informed
FAIRNESS
• EEOC
• Uniform Guidelines
• Adverse Impact
• Job Relatedness
• Diversity x Validity Dilemma
EEOC
The U.S. Equal Employment Opportunity Commission (EEOC) is responsible for enforcing federal laws that make it illegal to discriminate against a job applicant or an employee because of the person's race, color, religion, sex, national origin, age (40 or older), disability or genetic information.
Uniform Guidelines on Employee Selection Procedures
• Adopted in 1978
• Sets legal and validation standards
• Defines fairness: adverse impact
• Applies to all employment decisions
• Applies to all decision tools
In 1978:
- Laverne & Shirley was the #1 show
- Cost of a Super Bowl ad: $162K
- The Dow: 805
- Average income: $17K
- Space Invaders was released
Adverse Impact
o Practices that appear neutral but have a discriminatory effect
o A substantially different rate of selection in employment decisions (the 4/5ths rule)
o Not intentional discrimination (disparate treatment)
4/5ths Rule of Adverse Impact

                       Group A   Group B
APPLICANT POOL           100       100
NUMBER HIRED              80        40
SELECTION RATE           80%       40%

ADVERSE IMPACT RATIO: .40 / .80 = 50% (below 80%, so adverse impact)
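The 4/5ths rule arithmetic above can be computed directly. A minimal sketch (the function name and the two-group framing are illustrative assumptions, not from the deck):

```python
def adverse_impact_ratio(hired_focal, pool_focal, hired_reference, pool_reference):
    """Ratio of the focal group's selection rate to the reference
    (highest-selected) group's rate, per the 4/5ths rule."""
    focal_rate = hired_focal / pool_focal
    reference_rate = hired_reference / pool_reference
    return focal_rate / reference_rate

# The slide's example: 100 applicants per group, 80 vs. 40 hired.
ratio = adverse_impact_ratio(40, 100, 80, 100)
print(f"Adverse impact ratio: {ratio:.0%}")  # 40% / 80% = 50%
if ratio < 0.8:
    print("Below 4/5ths: evidence of adverse impact")
```

A ratio below 0.80 flags potential adverse impact; it is a screening heuristic, not a legal verdict.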
Legal Path: Disparate Impact
1. PLAINTIFF: Is there adverse impact? If NO, the case ends.
2. DEFENDANT: If YES, is the practice job related and consistent with business necessity? If NO, the plaintiff prevails.
3. PLAINTIFF: If YES, is there an alternative practice with less adverse impact?
Employers lose roughly 90% of these cases, at an average cost of $1.5 million.
BUT… 83% of all EEOC employment discrimination complaints target the selection process, not the selection tools.
Job Relatedness
Content Validity (80%)
Criterion Validity (15%)
Construct Validity (5%)
Content Validity
Is the "test" representative of the content of the job?

Job Analysis: the critical tasks/behaviors and the KSAOs needed to perform
Assessment: tests for those KSAOs

Better overlap = better content validity
Criterion Validity
Is there a meaningful and statistically significant relationship (a validity coefficient) between the selection assessment and performance or other criteria?
Criteria include supervisor ratings, production, sales numbers, safety violations, absences, etc.
A Note: Criterion Validity

Sample Size Problem
• Need "enough" people to do good science
• Small and medium-sized companies lack the numbers
• Not possible for roles with few people

Transport Validity / Validity Generalization
• Borrowing validity from similar jobs or from meta-analyses
• The science supports it, but the law doesn't always
So why ever use a test that can cause adverse impact?
o Some critical abilities are not distributed equally across groups
o Hiring poor performers is unfair to current employees
o Removes subjectivity and intentional discrimination from the hiring process
Diversity – Validity Dilemma
Sacrifice validity: use less valid selection tests that do not result in adverse impact to achieve social, ethical, or business aims.

Sacrifice diversity: ignore the potential adverse impact of valid selection procedures to achieve different social, ethical, or business aims.
Americans with Disabilities Act
o Reasonable accommodations during the hiring process:
  o Alternate test formats
  o Extended time
  o Braille
  o Screen readers
  o Assistants
o No accommodation required when there is undue hardship or job relatedness:
  o The assessment is highly similar to the job (e.g., physical tests)
  o No reasonable alternative to measure the construct (e.g., data entry speed)
What Can You Do?
o Choose an implementation approach
o Do a "job analysis"
o Choose the right assessments
o Choose when you use them
o Do validation (if you can)
o Choose experienced vendors
o Be responsible
IMPLEMENTATION APPROACH: Algorithm ↔ Data Point
Algorithm Approach
o "System" makes the decision
o Pure odds-making
o Recommendations
o Cut-off scores
o "Red/Yellow/Green"
o Auto ranking
o Higher volume (larger employee populations)
PROS
• Most efficient use of assessments
• No subjective bias
• No training needed
• Reduces recruiter, HR, and hiring manager wasted time

CONS
• Need serious "job relatedness" documentation
• Must carefully monitor selection rates
• Odds-making can be a cold way to hire
• Managers feel control is taken from them

Puts legal pressure on the tools
Data Point Approach
o People make the decisions
o Objective data informs decision-making
o Each assessment is a "data point"
o Training needed to interpret and use tools
o Better for smaller-volume roles
PROS
• Understand what makes people unique
• Selection tools don't reject candidates
• People become smarter about talent
• Managers still in control

CONS
• More room for subjective bias
• Need to train (and keep training)
• Confusion about how to weight different tools
• Need to build "buy-in"

Puts legal pressure on the system
JOB ANALYSIS APPROACH: Rigorous ↔ Test-Specific ↔ Nothing

RIGOROUS: Understand the tasks and KSAOs required for the job. Best practice but intensive. The information can be used for assessment choice PLUS many HR needs (e.g., performance management, job descriptions).

TEST-SPECIFIC: Understand job requirements in the context of the assessment (e.g., how much extraversion is needed). Be sure to involve multiple SMEs in the process.

NOTHING: Never good… Is the assessment measuring something needed for the job? "How much" of what the test measures do you need?
ASSESSMENT CHOICES: Very Predictive but Less Fair ↔ A Mix ↔ Less Predictive but More Fair
Assessment Choices
Unstructured ↔ Structured
ASSESSMENT TIMING
SCREEN OUT, EARLY IN PROCESS: More effective system, but larger numbers of people impacted.
SCREEN IN, LATER IN PROCESS: Less effective use of assessments, but fewer people impacted. Utility increases when the information is used to drive interviews.
MULTIPLE HURDLES
Use assessments according to your goal. Safer = fairer tools early in the process.
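The multiple-hurdles idea can be sketched as a sequential filter, applying the fairest tools first so later, higher-risk assessments touch fewer candidates. All names, fields, and thresholds below are illustrative assumptions, not from the deck:

```python
def multiple_hurdles(candidates, hurdles):
    """Apply each (name, passes) hurdle in sequence, keeping survivors."""
    for name, passes in hurdles:
        candidates = [c for c in candidates if passes(c)]
    return candidates

# Hypothetical applicant pool with made-up scores.
pool = [
    {"name": "A", "skills": 70, "personality_fit": True,  "cognitive": 55},
    {"name": "B", "skills": 90, "personality_fit": True,  "cognitive": 80},
    {"name": "C", "skills": 85, "personality_fit": False, "cognitive": 75},
]
hurdles = [
    ("skills screen",      lambda c: c["skills"] >= 75),   # fairer tool, early
    ("personality fit",    lambda c: c["personality_fit"]),
    ("cognitive ability",  lambda c: c["cognitive"] >= 60), # higher AI risk, late
]
finalists = multiple_hurdles(pool, hurdles)
print([c["name"] for c in finalists])  # ['B']
```

Ordering the hurdles this way means the cognitive test, the tool most likely to create adverse impact, is only administered to candidates who already cleared the earlier stages.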
RESEARCH APPROACH: Local Validation ↔ Validity Generalization / Transport ↔ Off-the-Shelf

LOCAL VALIDATION: Legally safest, but resource intensive and requires large samples.

VALIDITY GENERALIZATION / TRANSPORT / TOP-BOTTOM PERFORMANCE: Better than nothing, but still doesn't hold up well in court. Not all test vendors can do this well.

OFF-THE-SHELF: For fair tools this can be fine. But legally, this is a loss.
VENDOR CHOICE: Expertise ↔ Experience ↔ Fluff

EXPERTISE: Can the vendor provide legal guidance and show the assessment predicts job performance?

EXPERIENCE: Does the vendor have experience working with companies like yours?

FLUFF: Does the vendor say all the right things, but have nothing to back it up?

"Fluff" warning signs:
o "Certified by the EEOC"
o Liberal use of "valid" (without proof)
o The "fortune teller" debrief
o Unrealistic accuracy claims (e.g., 90% accurate)
o The AMAZING client logo list
o The 10-page technical manual
RESPONSIBILITY: Monitor Obsessively ↔ Close Eyes and Hope

Whether it is the tool or your entire system, you are responsible for knowing your applicant flow. You should review it at least annually.
ADVICE
Lower Volume
- More "data point" approach
- Test-specific job analysis that involves multiple SMEs
- Interpretation training for all users
- Order doesn't matter; weighting tools equally does
- Experienced vendors
- Monitor overall selection system adverse impact frequently

Higher Volume
- More algorithmic, through job analysis
- Solid local validation
- Multiple hurdles
- Expert vendors
- Monitor assessment adverse impact obsessively
Conclusion
o Use assessments!
o Yes, there are risks…
o Most lawsuits are about the process
o Be thoughtful about goals and approach
o Don't be afraid to use an assessment that causes adverse impact
o But be responsible in your implementation and monitoring
o Anything is better than biased and inaccurate unstructured interviews