Examining Effects of Visual Interaction Methods on Unmanned Aircraft Operator Situational Awareness (2011)

“Situational Awareness (SA) refers to the degree of accuracy by which one's perception of his current environment mirrors reality.” - United States (US) Naval Aviation Schools Command (NASC)


Situational Awareness (SA) can be said to reflect the amount of data a user has and is able to interpret at any given moment. Improving an operator's SA while operating a vehicle therefore improves overall performance, potentially decreases risk to the operational hardware, and increases operational efficiency.

The following represent key factors of the US NASC SA model:

PERCEPTION VERSUS REALITY
• View of Situation
• Incoming Information
• Expectations & Biases
• Incoming Information versus Expectations

FACTORS THAT REDUCE SITUATIONAL AWARENESS
• Insufficient Communication
• Fatigue / Stress
• Task Overload
• Task Underload
• Group Mindset
• "Press on Regardless" Philosophy
• Degraded Operating Conditions

*US NASC

Situational Awareness (SA) Overview


The key factors of SA Perception versus Reality (i.e., View of Situation, Incoming Information, Expectations & Biases, and Incoming Information versus Expectations) are observable in unmanned vehicle control actions or events. These key factors directly correspond to potential problems encountered when teleoperating an unmanned vehicle:

• Inaccurate representation of vehicle operating environment (View of Situation)

• Identification of potential threats to vehicle (Expectations & Biases)

• Loss of communication (Incoming information)

• Insufficient data presentation and comprehension (Incoming Info vs. Expectations)

This correspondence between the SA Perception versus Reality factors and the identified problems is a result of the unmanned vehicle/Ground Control Station (GCS) architecture and the control and data display interfaces utilized in current systems.

Situational Awareness (SA) Overview (cont…)


Recent advancements in the fields of telecommunications, robotics, aerospace, and man-machine interfaces have led to the increased usage of unmanned aircraft systems (UAS) in situations that would otherwise be hazardous or dangerous to humans. While the flight and operational capabilities of UAS have increased, through the inclusion of high-definition cameras and weapon deployment systems, the ability of an operator to sense the surroundings and state of the vehicle has remained the same. Current military aviators believe their ability to perform their duties is being hindered by the usage of such controls and outdated technology (Jean, 2008).

One of the identified human factors issues associated with operating or teleoperating an unmanned aircraft is the human integration and interaction with the “arcane and exhausting pilot-machine interface” (Mola, 2008, p. 2). The usage of such interfaces can result in the operator suffering from spatial disorientation, attentional bottlenecking, reduced situational awareness, and increased confusion and frustration (Olivares, Zhou, Bodenheimer, & Adams, 2003). A lack of operator situational awareness, when coupled with task prioritization, inadequate system knowledge, checklist error, workload inconsistency, fatigue, insufficient training, crew coordination issues, and poor ergonomics, can cause UAS accidents (Herz, 2008; Leduc, et al., 2005).

Identified Problem


Operational choices made using deficient or inaccurate situational awareness can be presented as examples of human error, since it has been determined that many unmanned aircraft accidents could have been foreseen through an analysis of the user interfaces and their use (Williams, 2004). The United States Air Force (USAF) identified human error as a significant causal factor in 133 (60.2%) of the 221 UAV mishaps occurring from 1994 to 2003: 24% were attributed solely to human error and 36% to a combination of human error and mechanical failure (Tvaryanas, Thompson, & Constable, 2005). Based on this series of documented mishaps, the USAF received a recommendation to evaluate and optimize the human-system interface of the Ground Control Station (GCS) used in the operation of UAVs (Tvaryanas et al., 2005). Some researchers, such as Taylor, Brown, and Dickson (2003), believe that enhancing situational awareness is crucial for unmanned aircraft operators, since they are removed from the vehicle's actual operating environment and therefore require real-time feedback on the state of the vehicle through the implementation of human-in-the-loop systems.

Identified Problem (cont…)


Purpose of Research


The primary focus of the planned research centers on the identification and evaluation of visual enhancement methods to increase the situational awareness of an Unmanned Aircraft System (UAS) operator. Concepts, techniques, and components related to visual control from the fields of gaming, simulation, robotics, and man-machine interfaces will be identified and integrated into a series of tests designed to observe their individual effectiveness at tasks associated with or correlating to unmanned aircraft operation. Using quantitative research methodologies and data collection, it is anticipated that the effects, both beneficial and adverse, that incorporating visual interaction methods has on situational awareness can be determined, and that a recommendation for implementation or criteria for further research can be provided.

Currently, four distinct enhancement treatments have been identified for the planned experimental testing: joystick analog axis control, head-tracked control, Hat/Point of View (POV) switch uninterrupted control, and Hat/POV switch incremental control. In addition to the enhancement methods, a control treatment that utilizes no eyepoint control, representing the typical system utilized in current UAS control systems, has also been identified.

Purpose of the Research: Overview


Through a review of the literature and the performance of an experiment, the research will attempt to answer the following research question:

Q1. To what extent, if any, does the use of a standard visual interaction method (static eyepoint) differ from a dynamic visual interaction method (analog joystick, head tracker, uninterrupted hat/POV switch, and incremental hat/POV switch) in affecting operator SA in an egocentric visual environment?

To answer the research question, a pair of null and alternative hypotheses was designed. These constructs establish a framework for testing that may identify whether the dynamic visual interaction methods, relative to the static visual interaction method, affect (alternative hypothesis) or do not affect (null hypothesis) the SA of an operator in an egocentric visual environment.

Null Hypothesis. There is no difference in operator SA between the dynamic visual interaction methods (analog joystick, head tracker, uninterrupted hat/POV switch, and incremental hat/POV switch) and the static visual interaction method (static eyepoint) in an egocentric visual environment.

Alternate Hypothesis. There is a difference in operator SA between the dynamic visual interaction methods (analog joystick, head tracker, uninterrupted hat/POV switch, and incremental hat/POV switch) and the static visual interaction method (static eyepoint) in an egocentric visual environment.
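Neither the research question nor the hypotheses specify the statistical analysis; as an illustration only, a one-way ANOVA across the five treatment groups is one plausible way to test them. In the sketch below the SA scores are invented and exist purely to show the mechanics:

    # Illustrative only: a one-way ANOVA comparing operator SA scores
    # across the five treatment groups. All scores below are invented.
    from scipy import stats

    sa_scores = {
        "A_static":            [62, 58, 65, 60],
        "B_joystick":          [70, 66, 72, 69],
        "C_head_tracker":      [74, 71, 69, 75],
        "D_hat_uninterrupted": [64, 61, 67, 63],
        "E_hat_incremental":   [63, 60, 66, 62],
    }

    f_stat, p_value = stats.f_oneway(*sa_scores.values())
    # Reject the null hypothesis (no difference in operator SA) if p < 0.05.
    print(f"F = {f_stat:.2f}, p = {p_value:.4f}")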

Research Question

Purpose of the Research (cont…)


Experimental Visual Control Techniques/Methods


The proposed study will include several independent variables, or treatments. Using quantitative research methodologies and data collection, it is anticipated that the effects, both beneficial and adverse, that incorporating enhanced visual interaction methods has on situational awareness can be determined, and that a recommendation for implementation or criteria for further research can be provided. The following visual control techniques/methods will be utilized in the research:

• Static Eyepoint (Treatment A)
• Joystick (Analog Axis) Control of Visual Eyepoint (Treatment B)
• Head Tracker Control of Visual Eyepoint (Treatment C)
• Joystick Hat/POV Switch Uninterrupted Control of Visual Eyepoint (Treatment D)
• Joystick Hat/POV Switch Incremental Control of Visual Eyepoint (Treatment E)

Experimental Visual Control Techniques/Methods


The static eyepoint represents the control treatment, as it is the visual interaction method most commonly utilized in current UAS control systems. In this interaction method, the eyepoint is fixed and therefore represents the field of view (FOV) of the utilized camera system. For the purposes of this research, a FOV value of 45 degrees has been selected for commonality between all platforms.
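As a minimal sketch of this geometry (not part of the study's software), the following assumes the 45-degree FOV is centered in a 180-degree camera environment, per the figure below:

    # Static eyepoint (Treatment A): the view never moves, so visibility of
    # an object depends only on the fixed 45-degree FOV.
    FOV_DEG = 45.0    # selected for commonality between platforms
    ENV_DEG = 180.0   # 45 + 67.5 + 67.5 degrees, per the figure below

    def on_screen(azimuth_deg):
        """True if an object at this azimuth (0 = camera center) is visible."""
        return abs(azimuth_deg) <= FOV_DEG / 2.0

    def in_environment(azimuth_deg):
        """True if the object lies within the camera environment at all."""
        return abs(azimuth_deg) <= ENV_DEG / 2.0

    # An object 30 degrees right of center is inside the camera environment
    # but outside the fixed visible screen area.
    print(on_screen(30.0), in_environment(30.0))   # False True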

[Figure: Horizontal and vertical camera environment diagrams. In each axis the 45-degree visible screen area is centered between two 67.5-degree areas outside the visible environment (left/right horizontally, up/down vertically). Key: visible display area; visible environment area; area outside visible environment.]

Static Eyepoint (Treatment Group A)

Experimental Visual Control Techniques/Methods (cont…)


The analog joystick (analog axis) treatment will focus on controlling the eyepoint of the visual display using the X- and Y-axis movement of a joystick device. The joystick movements (pitch/yaw) will be captured and translated into eyepoint (camera) movement through the visible environment area.
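One plausible implementation (an assumption, not the study's stated design) treats deflection as a rate command; the gain is invented and the 67.5-degree travel bounds are taken from the treatment diagrams:

    # Treatment B sketch: joystick X/Y deflection in [-1, 1] drives the
    # eyepoint pan/tilt as a rate command, clamped to the environment bounds.
    MAX_PAN = MAX_TILT = 67.5   # degrees of eyepoint travel from center
    RATE_GAIN = 60.0            # degrees per second at full deflection (assumed)

    def update_eyepoint(pan, tilt, axis_x, axis_y, dt):
        """Advance the eyepoint by one frame of dt seconds."""
        pan = max(-MAX_PAN, min(MAX_PAN, pan + axis_x * RATE_GAIN * dt))
        tilt = max(-MAX_TILT, min(MAX_TILT, tilt + axis_y * RATE_GAIN * dt))
        return pan, tilt

    # Full right deflection held for 0.5 s pans the view 30 degrees right.
    print(update_eyepoint(0.0, 0.0, 1.0, 0.0, 0.5))   # (30.0, 0.0)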

Analog Joystick (Analog Axis) Control of Visual Eyepoint (Treatment Group B)

[Figure: Visible environment area showing the eyepoint position for each joystick input: centered, panned left/right, tilted forward/back, and the combined pan/tilt corners. Pan moves the view left/right; tilt moves it up/down.]

Experimental Visual Control Techniques/Methods (cont…)



The head tracker treatment will focus on controlling the eyepoint of the visual display using a head tracker device. The head tracker will capture a participant's head movements (pitch/yaw) and translate them into eyepoint (camera) movement through the visible environment area. To return the view to center, the participant will need to recenter their head.
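Because recentering the head recenters the view, the mapping is absolute rather than incremental. A minimal sketch, with the 1:1 gain and the 67.5-degree bounds as assumptions:

    # Treatment C sketch: head yaw/pitch maps directly to eyepoint pan/tilt,
    # so the view follows the head and recenters when the head recenters.
    MAX_PAN = MAX_TILT = 67.5   # assumed eyepoint travel bounds in degrees

    def eyepoint_from_head(head_yaw_deg, head_pitch_deg, gain=1.0):
        """Return the (pan, tilt) eyepoint for the current head pose."""
        pan = max(-MAX_PAN, min(MAX_PAN, head_yaw_deg * gain))
        tilt = max(-MAX_TILT, min(MAX_TILT, head_pitch_deg * gain))
        return pan, tilt

    print(eyepoint_from_head(0.0, 0.0))    # (0.0, 0.0): head centered
    print(eyepoint_from_head(80.0, 0.0))   # (67.5, 0.0): clamped at the bound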

Head Tracker Control of Visual Eyepoint (Treatment Group C)

[Figure: Visible environment area showing the eyepoint position for each head pose: centered, panned left/right, tilted up/down, and the combined pan/tilt corners.]

Experimental Visual Control Techniques/Methods (cont…)



The joystick Hat/POV switch uninterrupted movement control method is designed to use software to read the state of the switch and send the eyepoint to the maximum bound of the camera FOV in the commanded direction. When the switch is released, the eyepoint returns to the center position.
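A minimal sketch of this behavior, assuming the 8-way hat is reported as an (x, y) pair with each component in {-1, 0, 1} and reusing the 67.5-degree maximum bounds from the earlier diagrams:

    # Treatment D sketch: an engaged hat/POV switch pins the eyepoint at the
    # maximum bound in the commanded direction; releasing it recenters.
    MAX_PAN = MAX_TILT = 67.5   # assumed maximum eyepoint travel in degrees

    def eyepoint_from_hat(hat_x, hat_y):
        """Map the 8-way hat state (each component -1, 0, or 1) to a position."""
        if hat_x == 0 and hat_y == 0:
            return 0.0, 0.0            # switch disengaged: snap back to center
        return hat_x * MAX_PAN, hat_y * MAX_TILT

    print(eyepoint_from_hat(1, 0))    # (67.5, 0.0): POV right max position
    print(eyepoint_from_hat(1, 1))    # (67.5, 67.5): POV up & right max position
    print(eyepoint_from_hat(0, 0))    # (0.0, 0.0): switch disengaged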

[Figure: Hat/POV switch states mapped to eyepoint positions: switch disengaged (centered) and the maximum positions for up, down, left, right, and the four diagonal combinations.]

Joystick Hat/POV Switch Uninterrupted Control of Visual Eyepoint (Treatment Group D)

Experimental Visual Control Techniques/Methods (cont…)



The joystick Hat/POV switch incremental movement control method is designed to use software to read the state of the switch and add the commanded increment to the previously calculated value. If the user starts from the center position (an X, Y value of 0, 0) with an increment value of 15 and then applies the up position on the Hat/POV switch, the eyepoint will move 15 degrees upward every second until the maximum bound is reached, the user releases the switch, or the switch direction is changed. To return the position to center, the view has to be incremented/decremented back to 0, 0.
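A minimal sketch of one update step, applied once per second while the switch is held, using the 15-degree increment from the example above and assumed 67.5-degree bounds:

    # Treatment E sketch: each held second adds the commanded increment to the
    # previous value and clamps it; releasing the switch leaves the eyepoint
    # in place, so the user must step back to (0, 0) to recenter.
    MAX_PAN = MAX_TILT = 67.5
    STEP_DEG = 15.0

    def step_eyepoint(pan, tilt, hat_x, hat_y):
        """Apply one incremental update for the current hat state."""
        pan = max(-MAX_PAN, min(MAX_PAN, pan + hat_x * STEP_DEG))
        tilt = max(-MAX_TILT, min(MAX_TILT, tilt + hat_y * STEP_DEG))
        return pan, tilt

    # Holding 'up' from center prints 15, 30, 45, 60, then clamps at 67.5.
    pan, tilt = 0.0, 0.0
    for _ in range(5):
        pan, tilt = step_eyepoint(pan, tilt, 0, 1)
        print(tilt)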

[Figure: Visible environment area showing the eyepoint value centered at (0, 0) and pan/tilt incremented left, right, up, and down, alone and in combination.]

Joystick Hat/POV Switch Incremental Control of Visual Eyepoint (Treatment Group E)

Experimental Visual Control Techniques/Methods (cont…)



Test System Overview


The test system will be utilized as the test platform to determine the usability and effectiveness of the enhanced visual control options identified in the research design. The system was designed to reflect generic components available in current GCS operator terminals, supplemented with additional testing components that can be enabled/disabled on an individual basis. The test system is comprised of a PC (netbook/laptop), a 32” LCD display, a joystick with Hat/POV switch, and a head tracker system (Wii remote/IR LED/Bluetooth transceiver/clear glasses). The system will utilize a custom-developed input processing application and a custom-developed testing application for all research testing.
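The custom input processing application is not described further; as an illustration only, a polling loop of roughly the following shape could gather the joystick and hat/POV inputs (pygame is an assumed stand-in here, not the study's actual implementation, and the head tracker path is omitted):

    # Hypothetical input-processing loop: poll the joystick's analog axes and
    # hat/POV switch and hand the values to the active treatment's logic.
    import pygame

    pygame.init()
    pygame.joystick.init()
    stick = pygame.joystick.Joystick(0)   # assumes one joystick is attached
    stick.init()

    clock = pygame.time.Clock()
    while True:
        pygame.event.pump()                # refresh device state
        axis_x = stick.get_axis(0)         # analog X deflection in [-1, 1]
        axis_y = stick.get_axis(1)         # analog Y deflection in [-1, 1]
        hat_x, hat_y = stick.get_hat(0)    # 8-way hat state, each in {-1, 0, 1}
        # ...route values to the active treatment (B, D, or E) here...
        clock.tick(60)                     # poll at roughly 60 Hz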

Test System Overview

[Figure: Test system hardware diagram. The test PC drives the 32” LCD test display via VGA, connects to the USB joystick and the Bluetooth adapter via USB, and communicates with the head tracker device via wireless IR and wireless Bluetooth.]

Test System Hardware Overview


Test System Overview (cont…)

Actual Test System


Envisioned Future Research


The following represent potential future research directions related to increasing the situational awareness of UAS operators:

• The examination and analysis of results while the participant is under attentional load (e.g., flying the aircraft, navigating the database, performing activities associated with operation of the aircraft)
• The incorporation of spatially representative (three-dimensional) audio cues
• The utilization of multiple monitors/cameras to depict the environment
• Multiple-camera three-dimensional environmental recreation (processing simultaneously captured visual inputs to account for camera control transmission latency) for a single monitor
• The examination and analysis of the incorporation of touch-sensitive devices (e.g., iPod Touch/iPhone, HP iPaq, touch pad, etc.) as control inputs

Envisioned Future Research