WP 3: Human Performance Metrics
10 Oct 2002
Brian Hilburn
Goals
Extract a set of ASAS-applicable Human Performance metrics
Outline application methods (guidance)
Provide primary references
Mapped onto:
- Application (e.g. Airborne Separation)
- Validation Type (e.g. Real time, etc.)
- Agent (ATC vs Flight deck)
- Human Performance Area
Human Performance Areas
– Workload
– Situation Awareness
– System Monitoring
– Teamwork
– Trust
– Usability and User Acceptance
– Human Error
Parallel WP2 (System Metrics)
– SYSTEM versus HUMAN performance?
  (distinction sometimes arbitrary)
– Performance Areas differ
– Implicit links to Goals
  (human performance influences SAFETY, etc.)
WP 3 Deliverable
[Matrix: ASAS functions f1 … fn mapped against Agent (Air / Ground), Human Performance Area (Workload, SA, etc.), Data Type (Objective / Subjective), and Validation Type (Fast time / Real time)]
Methods
Literature review
- ASAS sources (transatlantic FreeFlight)
- ATM validation efforts (INTEGRA, RHEA,…)
- Aviation Human Factors
- Human Factors general
Distillation of methods and metrics
Guidelines
Including general Measurement Issues
General Measurement Issues
Validation test subjects
Experimental design in validation
Analytical techniques
Balancing validation realism & data collection
Format of Metrics
Columns of the metrics table:
- Function (e.g. Airborne surveillance functions)
- Validation Technique (e.g. Fast Time)
- Agent (e.g. Airside)
- Human Performance Area or Concept (e.g. Workload, Situation Awareness)
- Data Type (e.g. Subjective, Objective)
- Metric (what?)
- Comments (how?)
Guidance:
Literature Consensus and Expert Opinion...
EXAMPLE
Validation Technique: RTS
Metric: Pupil diameter
Subjective/Objective: Objective
Intrusiveness: Med-High
Cost of Equipment: High
Reliability: High
Validity: Med*
Expertise Required: High
Resource Intensity: High
Comments: Good temporal resolution, but also costly in terms of expertise. Equipment for eye tracking is expensive and generally not portable; ocular measures are currently quite intrusive. Subject to light (e.g. probably not suitable for use in daylight cockpit settings) and other artefacts. (Beatty, 1986)
100+ Human Performance Metrics
• NASA Task Load Index
• Heart Rate Variability
• Pupil Diameter
• Human Computer Trust Scale
• Auditory Choice Secondary Task Response Time
• Situation Awareness Rating Technique
• Blink latency
• Choice Reaction Time secondary task
• etc.
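Several of the metrics listed above are simple to compute once ratings are collected. As one illustration, the NASA Task Load Index combines six subscale ratings (0-100 scale) with weights derived from 15 pairwise comparisons. A minimal sketch (the ratings and weights shown are invented for illustration, not taken from the deck):

```python
# NASA-TLX overall workload: weighted average of six subscale ratings.
# The weights come from 15 pairwise comparisons, so they sum to 15.
def tlx_score(ratings, weights):
    """ratings, weights: dicts keyed by the six TLX subscales."""
    assert sum(weights.values()) == 15, "weights must total 15 comparisons"
    return sum(ratings[s] * weights[s] for s in ratings) / 15.0

# Illustrative ratings (0-100) and weights for one participant.
ratings = {"mental": 70, "physical": 20, "temporal": 80,
           "performance": 40, "effort": 65, "frustration": 55}
weights = {"mental": 4, "physical": 1, "temporal": 5,
           "performance": 2, "effort": 2, "frustration": 1}
print(round(tlx_score(ratings, weights), 1))  # prints 64.3
```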
Choosing a Metric
– Objectivity: Objective / Subjective
– Intrusiveness: High / Low
– Cost: High / Low
– Reliability: High / Low
– Validity: High / Low
– Utility: High / Low
– Expertise Required: High / Low
– Resource Intensity: High / Low
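These criteria can be applied as a first screening step over candidate metrics. A sketch, where the pupil-diameter profile follows the example entry earlier in this deck and the other profiles are assumptions for illustration:

```python
# Screening candidate metrics against hard selection requirements.
# The pupil-diameter profile follows the example entry in this deck;
# the other profiles are illustrative assumptions.
CANDIDATES = {
    "NASA-TLX":       {"intrusive": "low",  "cost": "low",  "reliability": "high", "validity": "high"},
    "Pupil diameter": {"intrusive": "high", "cost": "high", "reliability": "high", "validity": "med"},
    "Secondary task": {"intrusive": "high", "cost": "low",  "reliability": "med",  "validity": "med"},
}

def screen(candidates, require):
    """Keep metrics whose profile meets every hard requirement."""
    return [name for name, prof in candidates.items()
            if all(prof[k] == v for k, v in require.items())]

# Only NASA-TLX survives a low-intrusiveness, low-cost requirement.
print(screen(CANDIDATES, {"intrusive": "low", "cost": "low"}))
```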
A Worked Example...
• Given: Airborne Spacing application
(in-descent spacing scenario)
Controlled airspace
• Study: Real time
• Question: Transient workload peaks?
• Issues: Time resolution, intrusiveness, cost
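Catching transient peaks demands a metric with fine temporal resolution, which is why a continuous proxy such as pupil diameter is attractive despite its cost and intrusiveness. One common approach is to flag momentary excursions above a rolling baseline; a hypothetical sketch (the sample values, window, and threshold are invented for illustration):

```python
import statistics

# Detect transient peaks in a continuous workload proxy
# (e.g. pupil diameter samples, in mm).
def find_peaks(signal, window=5, k=2.0):
    """Flag sample indices rising more than k standard deviations
    above the mean of the preceding `window` samples."""
    peaks = []
    for i in range(window, len(signal)):
        base = signal[i - window:i]
        mu = statistics.mean(base)
        sd = statistics.pstdev(base) or 1e-9  # guard against flat baseline
        if signal[i] > mu + k * sd:
            peaks.append(i)
    return peaks

samples = [3.0, 3.1, 3.0, 3.2, 3.1, 3.0, 4.5, 3.1, 3.0, 3.2]
print(find_peaks(samples))  # flags the transient at index 6
```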
Applying the Framework
General Guidance
            Workload  Human Error  Sys Monitoring  Usability  SA    Teamwork  Trust
Analytic    *         ***          **              **         *     **        *
Fast Time   *         ***          *               **         *     **        *
Real Time   ***       *            ***             ***        ***   ***       *
Survey      **        *            *               ***        **    *         ***
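The star ratings above can be encoded for quick lookup when matching a validation technique to a performance area. A sketch (ratings transcribed from the slide, more stars meaning a better fit; the helper function is illustrative):

```python
# Suitability of validation technique per Human Performance Area,
# transcribed from the slide (1-3 stars).
SUITABILITY = {
    "Analytic":  {"Workload": 1, "Human Error": 3, "System Monitoring": 2,
                  "Usability": 2, "SA": 1, "Teamwork": 2, "Trust": 1},
    "Fast Time": {"Workload": 1, "Human Error": 3, "System Monitoring": 1,
                  "Usability": 2, "SA": 1, "Teamwork": 2, "Trust": 1},
    "Real Time": {"Workload": 3, "Human Error": 1, "System Monitoring": 3,
                  "Usability": 3, "SA": 3, "Teamwork": 3, "Trust": 1},
    "Survey":    {"Workload": 2, "Human Error": 1, "System Monitoring": 1,
                  "Usability": 3, "SA": 2, "Teamwork": 1, "Trust": 3},
}

def best_technique(area):
    """Validation technique(s) with the highest rating for an area."""
    top = max(t[area] for t in SUITABILITY.values())
    return sorted(n for n, t in SUITABILITY.items() if t[area] == top)

print(best_technique("Workload"))  # Real Time scores highest (***)
```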
Conclusions
Link between System and Human Performance metrics
What are my goals? (relationship to higher validation goals)
Guidance, not a cookbook
Ideally used in conjunction with SYSTEM metrics