

SEVENTH FRAMEWORK PROGRAMME System and Actions for VEhicles and transportation hubs

to support Disaster Mitigation and Evacuation

Grant Agreement No. 234027

DELIVERABLE 8.1

Pilot Plans

Workpackage No. WP8 Workpackage Title Pilot Testing

Activity No. A8.1 Activity Title Pilot Plans

Authors (per company, if more than one company provide it together)

G. Evans - UNEW

S. Colombetti, F. Tesauri – UNIMORE

M. Marzoli - CNVVF

Status (F: final; D: draft) F

File Name SAVE ME D8.1_v2.1.doc

Project start date and duration 01 October 2009, 36 Months

Co-funded by the European Commission


SAVE ME Deliverable 8.1 PU Contract N. 234027

December 2010 i UNEW

Version history table

Date         Version  Comments
07/03/2010   1.0      First Deliverable draft: structure and outline description of the Pilot Sites
01/08/2010   1.1      Updated structure following the 4th project meeting
19/10/2010   1.2      Further updates following the 5th project meeting
01/12/2010   1.3      Additions and amendments following input from UNIMORE
06/01/2011   1.4      Revisions based upon QIR comments received from CNVVF
20/01/2011   1.5      Final (1st iteration): final revisions based upon the consolidated comments from the Quality Manager
18/01/2012   2.0      Updated version following developments with GST and relocation of the tunnel pilot site to Colle Capretto, Perugia, Italy
28/03/2012   2.1      Final revisions to the updated version in light of the technical meeting in Newcastle, March 2012


Table of contents

Table of contents
List of Figures
List of Tables
List of abbreviations
Executive Summary
1. Introduction
2. Pilot Plan Methodology
3. Pilot Site Descriptions
   3.1. Monument Metro Station, Newcastle upon Tyne, UK
   3.2. Colle Capretto Tunnel, San Gemini, near Perugia, Italy
4. SAVE ME Use Cases, User Needs and Expected Impacts
   4.1. Use Cases
   4.2. User Needs
   4.3. Expected Impacts
   4.4. List of Use Cases to be Implemented
5. Common Evaluation Framework (CEF) and Evaluation Tools
   5.1. Technical Metrics
   5.2. User Acceptance (Non-Technical Metrics)
   5.3. Economic Evaluation
6. Laboratory Pilot Tests
7. Test Procedures, Scenarios and Roadmap for the Pilot Plans
   7.1. Pilot Test Administration and Set-up
   7.2. Newcastle Scenarios
   7.3. Colle Capretto Scenarios
   7.4. Gantt Chart/Roadmap
8. Conclusion
9. References
Appendix A – Final Pilot Site Questionnaires


List of Figures

Figure 1: SAVE ME Evaluation Flow line
Figure 2: The location of the SAVE ME Pilot Sites
Figure 3: The Tyne and Wear Metro network map, highlighting the central position of Monument station on the network
Figure 4: Location of Monument Metro Station in Newcastle City Centre
Figure 5: Monument Metro Station Platform Layout
Figure 6: Location Map of the Colle Capretto Tunnel
Figure 7: Confusion Matrix, showing system predictions and actual events
Figure 8: Example of Graph Plot for Van Westendorp’s Price Sensitivity Meter (Farace, 2008)

List of Tables

Table 1: Pilot Site Attributes
Table 2: SAVE ME User Needs and Expected Impacts
Table 3: SAVE ME Use Cases to be Implemented at Each Pilot Site
Table 4: SAVE ME High-Level Evaluation Criteria
Table 5: Evaluation Components and Partners Responsible
Table 6: SAVE ME Technical Metrics
Table 7: SAVE ME Non-Technical Aspects
Table 8: SAVE ME Non-Technical Metrics
Table 9: SAVE ME Laboratory Technical Testing
Table 10: Features to be included in the SAVE ME Pilot Testing
Table 11: Monument Metro Station Scenario Run 1
Table 12: Monument Metro Station Scenario Run 2
Table 13: Monument Metro Station Scenario Run 3
Table 14: Colle Capretto Tunnel Scenario Run 1
Table 15: Colle Capretto Tunnel Scenario Run 2
Table 16: Colle Capretto Tunnel Scenario Run 3


List of abbreviations

Abbreviation Definition

CEF Common Evaluation Framework

DSS Decision Support System

SHAPE Solutions for Human Automation Partnerships in European ATM

SUS System Usability Scale

VDLA Van Der Laan Acceptance scale

VWPSM Van Westendorp’s Price Sensitivity Meter


Executive Summary

WP8 of SAVE ME is concerned with the pilot testing and evaluation of the SAVE ME system. The key objective of the WP is to co-ordinate the test activities at the two pilot sites (Monument Metro Station, Newcastle upon Tyne, UK; Colle Capretto Tunnel, San Gemini near Perugia, IT), ensuring that all tests follow a common model which will collect relevant and feasible data to provide a robust evaluation of the SAVE ME system.

This deliverable, D8.1, presents the final pilot plans for the SAVE ME project. The first iteration was submitted in M14 but, given the on-going nature of the work in SAVE ME, further amendments to the pilot plans have been required to reflect changes, developments and alterations to the technical aspects of the SAVE ME system. This has allowed the final tests to better reflect the final outputs and capabilities of the system, so the document was kept as a living document until the technical development work in WP4 and WP5 had been completed. In light of later developments with the original pilot site at the Gotthard Strassentunnel, the final pilot tests will now take place at the Colle Capretto Tunnel, located on the SS3bis Autostrada (part of the E45 Perugia – Terni road) near San Gemini, Italy. This switch has necessitated further changes to the pilot plans. This is now the final version of D8.1.

Deliverable Structure

Chapter 1 is the introduction, presenting a general overview of the SAVE ME pilot plans, their aims and the purpose of this document. An outline schematic of the interrelations of the various preceding work packages is also given.

Chapter 2 presents an outline of the proposed methodology flow line which will be adopted for the SAVE ME project. This describes how inputs from earlier WPs will be used to define evaluation metrics, and the data collection methods which will be applied in the pilot tests.

Chapter 3 gives a detailed description of the two SAVE ME pilot sites and their characteristics. The background issues affecting each pilot site are also discussed, to illustrate and contextualise how the SAVE ME system can play a beneficial role in the safety of travellers.

Chapter 4 provides a summary of the 62 Use Cases and the different User Needs identified in WP1 (see D1.1 SAVE ME Application Scenarios and Use Cases for detailed information), describes the expected impacts of the SAVE ME system components and their anticipated level of impact, and identifies which of the Use Cases will be implemented in the various scenarios during the pilot tests at each pilot site.


Chapter 5 gives an overview of the proposed Common Evaluation Framework (CEF) to be adopted for the pilot testing at the two sites outlined in Chapter 3. The CEF is divided into two broad categories: the technical evaluation of the underlying technologies and systems, and the non-technical evaluation of human factors and user acceptance. The chapter then describes the various evaluation metrics that will be used to assess the overall SAVE ME system, and the tools that will be used to collect them.

Chapter 6 provides an outline of the various pre-pilot, laboratory-based testing activities. These will act as a debugging step before the final pilot tests are carried out, to ensure that any technological errors and bugs are eliminated.

Chapter 7 presents the test scenarios that will be implemented in the pilot testing. The scenarios have been developed and strategically selected as the technological development of the SAVE ME system progressed, ensuring that the scenarios implemented are feasible under the SAVE ME system yet deliver the data required for a robust evaluation; the final scenarios are presented in this final iteration of Deliverable D8.1.

Chapter 8 gives the concluding remarks of the Deliverable.


1. Introduction

WP8 ‘Pilot Testing’ is concerned with the co-ordination of the SAVE ME on-site tests conducted at the two pilot sites: Monument Metro Station, Newcastle, UK, and the Colle Capretto Tunnel, San Gemini, Italy. The aim of the Pilot Tests is to apply and evaluate the SAVE ME system, its modules and the Decision Support System (DSS), ensuring that the tests conducted at both sites return relevant and feasible data. This will be achieved through the following tasks:

Define a set of evaluation metrics, guided by the requirements of the SAVE ME Use Cases and User Needs;

Develop common data gathering and analysis tools, through the establishment of a common evaluation framework (CEF);

Develop a data collection methodology to be used to assess both the technical and user (i.e. non-technical) aspects of SAVE ME.

Outputs from WP8 include an assessment of the SAVE ME system in terms of its technical performance and reliability, and the views and opinions of users regarding the system in terms of user acceptance, usability, market viability, economic benefits and other appropriate metrics. All of these measures will be used to provide system optimisation recommendations and future contributions to policy and practice guidelines for safety in transport infrastructures.

Purpose of this Document

The aim of this document is to present the SAVE ME Pilot Plans and the various metrics which will be used to evaluate the performance of, and user opinions on, the SAVE ME system, and which will guide the overall testing of the system in WP8. The plans are based upon test scenarios designed to demonstrate all components of the disaster detection and management system, the technical components and the DSS. Earlier WPs (2, 3 and 4) define and develop the underlying system architectures, ontological frameworks, algorithms and detection systems, all of which feed into WP5 (Decision Support System). WP6 and WP7 develop the emergency support and user training measures, and all of the above are evaluated in WP8, with the results feeding into WP9 (Dissemination). This WP is therefore a crucial link for the SAVE ME project. The pilot plans will be designed to focus on the key areas of innovation within the SAVE ME project, these being:

User localisation in tunnels, terminals and hubs using combinations of different sensor technologies (WP4) and existing wireless and mobile technologies

Dynamic monitoring of position and movement of people and vehicles (WP4)

Personalised route guidance via mobile technologies, and generic route guidance for those without mobile technologies (WP6)

The DSS for guiding emergency support teams (WP5)
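As a purely illustrative sketch of the first item above (user localisation by combining readings from different sensor technologies), the toy example below fuses hypothetical fixed-beacon signal-strength readings into a single position estimate via a weighted centroid. All coordinates, RSSI values and function names here are invented for illustration; the actual SAVE ME localisation modules are specified in WP4.

```python
# Illustrative only: a toy RSSI-weighted centroid fusing hypothetical
# fixed-beacon readings into one position estimate. Beacon positions and
# signal strengths are invented; this is not the SAVE ME WP4 algorithm.

def weighted_centroid(beacons):
    """beacons: list of ((x, y), rssi_dbm) pairs; a stronger (less
    negative) RSSI pulls the estimate towards that beacon."""
    # Convert dBm to a positive linear weight (nearer beacon -> larger weight).
    weights = [10 ** (rssi / 10.0) for (_, rssi) in beacons]
    total = sum(weights)
    x = sum(w * pos[0] for (pos, _), w in zip(beacons, weights)) / total
    y = sum(w * pos[1] for (pos, _), w in zip(beacons, weights)) / total
    return (x, y)

readings = [((0.0, 0.0), -50.0),   # strong signal: user is near this beacon
            ((10.0, 0.0), -70.0),
            ((0.0, 10.0), -70.0)]
est = weighted_centroid(readings)  # estimate lies close to the first beacon
```

A production system would of course calibrate the path-loss model and combine several sensor types, but the sketch shows the basic fusion idea.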


2. Pilot Plan Methodology

The following schematic diagram (Figure 1) represents the flow line underpinning the evaluation framework, to be implemented for the assessment of SAVE ME:

Figure 1: SAVE ME Evaluation Flow line

Starting at the top left section of the flow line, the initial definition of the various non-technical (user) and technical needs which will be addressed in SAVE ME begins with an evaluation of earlier WPs. More specifically, this will be a synthesis of the key findings from two input documents.


Moving to the right-hand side of the flow line, these needs will ultimately be used to derive the evaluation metrics (technical and non-technical). WP1 focuses on the definition of a range of Use Cases (UCs) and User Needs (UNs), which will inform the non-technical metrics, whilst WP2 defines the sensors and modules to be used in the development of the SAVE ME system and thus feeds the technical metric definition. Of these two inputs, the Use Cases defined in WP1 can be applied to help define many of the requirements and metrics on both the non-technical and technical sides, so they will be used as the primary source of information. More detailed information can be found in the relevant deliverables: D1.1 – SAVE ME Applications, Scenarios and Use Cases; D2.1 – System Architecture, Ontological Framework and Modules Specification.

After the specification of the metrics to be used to evaluate SAVE ME, the next step in the flow line is to identify the most appropriate tools for the measurement and analysis of both the technical and non-technical metrics. From this, a Common Evaluation Framework (CEF) will be developed to ensure that tests and results from both final pilot sites (Newcastle and Colle Capretto) are gathered in a similar manner. The CEF (Chapter 5) will include a roadmap for the integration of the various SAVE ME system components, followed by an implementation plan for the testing and evaluation programme. These ultimately form the final Pilot Plans, which include this current Deliverable (D8.1).

After this, the first stage of implementation and testing in a laboratory environment can begin. This is described in greater detail in Chapter 6; the primary focus of the laboratory testing is to ensure that the SAVE ME technical system is free from software bugs and other errors, together with initial user/stakeholder engagement and pre-assessment of the plans.

Towards the end of the project (M32-33), the Pilot Plans will be realised at the two pilot sites in Newcastle and Colle Capretto. In the first iteration of D8.1 a series of possible outline scenarios were presented; from these, the final scenarios to be used in the testing are defined in this version of the deliverable. This was necessary to allow the final SAVE ME system to be defined in full and for engagement and consultation with the pilot sites to take place, ensuring that the scenarios eventually agreed upon can actually be delivered within the scope of the project. The scenarios are presented in Chapter 7.

The final step of the flow line is the consolidation of all pilot results, based upon the tests carried out at the two pilot sites. These results, and other information obtained during the testing, will be used to inform the dissemination and exploitation activities in WP9.
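As a minimal illustration of the kind of technical metric the CEF is expected to consolidate (cf. Figure 7, the confusion matrix of system predictions against actual events), the sketch below derives accuracy, precision and recall from confusion-matrix counts. The counts used are invented placeholders, not pilot data.

```python
# Minimal sketch of confusion-matrix-derived detection metrics of the kind
# suggested by Figure 7 (system predictions vs. actual events). The counts
# passed in below are invented placeholders, not SAVE ME pilot results.

def detection_metrics(tp, fp, fn, tn):
    """Return (accuracy, precision, recall) for a binary event detector.

    tp: events correctly detected        fp: false alarms
    fn: events missed by the system      tn: correctly quiet periods
    """
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    precision = tp / (tp + fp) if (tp + fp) else 0.0  # reliability of alarms
    recall = tp / (tp + fn) if (tp + fn) else 0.0     # share of events caught
    return accuracy, precision, recall

# Placeholder counts: 45 detected events, 5 false alarms, 10 misses.
acc, prec, rec = detection_metrics(tp=45, fp=5, fn=10, tn=940)
# acc = 0.985, prec = 0.9, rec = 45/55 ≈ 0.818
```

For a safety system, recall (the proportion of real events detected) is typically weighted more heavily than precision, since a missed incident is costlier than a false alarm.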


3. Pilot Site Descriptions

At the end of the project, the full SAVE ME system will be realised in two real-world scenarios. Figure 2 shows the location of each pilot site, and the attributes of these pilot sites are described in greater detail in this section.

Figure 2: The location of the SAVE ME Pilot Sites

3.1. Monument Metro Station, Newcastle upon Tyne, UK

The Tyne and Wear Metro is a light rail rapid transit system serving the five boroughs of Tyne and Wear. The system plays a crucial role in the transport services in the region, with 40 million passenger journeys being made in 2007-08. The original system opened in 1980, using a combination of existing, disused heavy rail alignments and new sections of underground tunnels underneath Newcastle and Gateshead. The extension to Sunderland opened in 2002, and is unique in the UK as the light rail trains share the route between Pelaw Junction and Sunderland station with heavy rail trains on the National Rail network.


There are 60 stations on the network, all of which are distinct stations (as opposed to on-street platforms, or stops integrated into their surroundings, as found on tram systems) with full-height platforms. Stations are a mixture of former main line railway stations and purpose-built stations constructed during the conversion of the system to light rail operation. All stations have ticket machines, shelters, information displays, next-train indicators and passenger information/emergency help-points. The majority of stations are overground, but a number in central Newcastle and Gateshead are underground (Haymarket, Monument, Central, St. James, Manors and Gateshead), as is Park Lane in Sunderland. Sunderland station was rebuilt in 1965 with the station building covering the platforms, effectively making this station an underground facility as well. All overground stations are unstaffed (except for South Gosforth, which hosts the Network Control Centre), whilst all the underground stations must be staffed by law.

The metro system is managed by Nexus (formerly the Tyne and Wear Passenger Transport Executive) on behalf of the Tyne and Wear Passenger Transport Authority. As part of the ongoing Metro modernisation programme, from 1 April 2010 service operations, rolling stock maintenance and modernisation are provided by DB Regio, whilst Nexus retain control of operating hours, service frequency and fare levels. This new arrangement will not have any impact upon the use of Monument as a pilot site for SAVE ME.

The map of the system is shown below. Monument station is located in the middle of Newcastle City Centre, at the heart of the Metro network (shown by the red oval):

Figure 3: The Tyne and Wear Metro network map, highlighting the central position of Monument station on the network.

Monument Station

Monument Station will be used as the pilot site for the SAVE ME project. The underground station is located in the centre of Newcastle City Centre, underneath Grey’s Monument (see map, Figure 4):


Figure 4: Location of Monument Metro Station in Newcastle City Centre

Monument station is a major hub in the region’s transport network: it is served by the Green metro line and both sections (north/south and east/west) of the Yellow metro line, has a number of bus stops located on nearby Blackett Street, Pilgrim Street and Market Street, and is a short walk from the Eldon Square and Haymarket bus stations. It also serves the main shopping areas of Newcastle, being in close proximity to the pedestrianised areas of Northumberland Street, Grey Street and Grainger Street, as well as the Eldon Square and Monument Mall shopping complexes.

The station is also close to St. James’ Park, home of Newcastle United FC. Although the stadium has its own metro station (at the terminus of the Yellow line), many people travelling from the South Gosforth, Gateshead and South Shields directions will alight at Monument and then walk to the stadium, avoiding the interchange for the one-stop journey to St. James. This means the station can be particularly busy on Saturday afternoons, when many shoppers and football fans pass through it. Average weekly patronage through the station is in the region of 113,500 people (based on figures supplied by Nexus).

Monument is unique on the Metro system, as the Yellow line passes through it twice: once between St. James and Manors in an east/west direction, then again between Haymarket and Central Station in a north/south direction. It is therefore also a busy interchange station, with passengers transferring between the different sections of the Yellow line and onto other modes of transport. Overall, Monument is the busiest station on the Metro network: in 2008/09 there were nearly six million recorded passengers using the station (Nexus, 2009).


The station layout is also unique to the system, as it has four underground platforms spread across two levels, each level having a link to the concourse as well as to the other, as shown by the following schematic:

Ground   Street Level
-1       Concourse Level (Tickets, Travel Information Centre, Kiosks)
-2       Platforms 3 & 4 (Yellow Line [E/W], St. James to Manors)
-3       Platforms 1 & 2 (Green Line and Yellow Line [N/S], Haymarket to Central), linked to Platforms 3 & 4 by connecting stairs

The layout of each platform level, exit via escalator to the concourse level plus the connection stairs between the platforms is shown in Figure 5 below.

Figure 5: Monument Metro Station Platform Layout

As both exit points from Platforms 3 & 4 are towards one end of the platforms, whilst Platforms 1 & 2 have one exit in the middle of the platforms and one at the end, it has been decided to focus the pilot plans on Platforms 1 & 2. This provides a greater variety of exit routes (shown by the arrowed lines) compared with those available from Platforms 3 & 4: a passenger located at Point A on Platforms 1 & 2 would have two exit routes to choose from in an emergency, whereas on Platforms 3 & 4 the majority of passengers would rely on the escalator exit. Moreover, as Platforms 3 & 4 are above Platforms 1 & 2, the connecting-stairs exit from Platforms 3 & 4 would require people to head downwards, i.e. further away from the concourse level and from safety.
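The reasoning above, choosing among alternative exit routes when one becomes unavailable, can be sketched as a toy breadth-first search over a simplified graph of the station levels. The node names and links below are a hypothetical abstraction of Monument station, not survey data; the SAVE ME route guidance proper is developed in WP6.

```python
# Toy sketch of generic exit-route selection on a simplified, hypothetical
# graph of the station described above. Not the SAVE ME WP6 guidance system.
from collections import deque

# Hypothetical walkable links between locations (undirected, listed both ways).
station = {
    "point_A":   ["mid_exit", "end_exit"],
    "mid_exit":  ["point_A", "concourse"],
    "end_exit":  ["point_A", "concourse"],
    "escalator": ["concourse"],
    "concourse": ["mid_exit", "end_exit", "escalator", "street"],
    "street":    ["concourse"],
}

def shortest_route(graph, start, goal, blocked=frozenset()):
    """Breadth-first search avoiding blocked nodes; returns a node list or None."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in graph[path[-1]]:
            if nxt not in seen and nxt not in blocked:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

# With the mid-platform exit blocked, guidance falls back to the end exit.
route = shortest_route(station, "point_A", "street", blocked={"mid_exit"})
# route == ["point_A", "end_exit", "concourse", "street"]
```

The availability of a second exit from Point A is exactly what makes such a fallback possible, which is why Platforms 1 & 2 offer the richer test case.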



Once at the concourse level, access to Monument Station from street level is by stairs from Blackett Street and Grey Street. An at-grade entrance also exists to/from the adjacent Monument Mall, with an additional exit (by stairs and lift) into the Eldon Square shopping centre. The station itself hosts one of the Nexus Travel Centres and is staffed (all underground stations in the UK must be staffed, a legal requirement following the fatal King's Cross fire in 1987).

Safety and Security on the Tyne and Wear Metro

Nexus have developed an Emergency Preparedness Manual (EPM) which details a number of potential emergency scenarios and the actions required by different sections of Nexus personnel should such a scenario arise. In addition to this, responsibilities and actions required by external organisations (e.g. the emergency services) are also provided in the EPM.

A Contingency Plan has also been compiled by Nexus, in co-operation and consultation with Northumbria Police, Tyne and Wear Fire and Rescue Service, North East Ambulance Service, the Health and Safety Executive, the Tyne and Wear Emergency Planning Unit, and Network Rail (who operate along adjacent tracks at some locations and are fully responsible for the rail infrastructure on the route between Pelaw and Sunderland). The Contingency Plan has been written in response to the requirements laid on the Passenger Transport Authority by the provisions of the Fennell Report, although there is of course no suggestion implicit in this of any increased risk of an accident on the Metro system. The Plan is, however, an additional level of preparation should such an event occur; its aim is to enable the speedy mobilisation of any resources required to deal with an incident, to provide communications and command at the incident, and to co-ordinate the activities of the Emergency Services deployed.

Upgraded Emergency and Evacuation Communications for the Metro's Extension to Sunderland

The section between Pelaw and Sunderland is operated by both Metro trains and heavy rail trains. Team Telecom was employed to deliver a Retail Telecoms system providing 600 Mbit/s data transmission between the Metro control centre and the 12 new stations (2 of which are sub-surface) via an optic fibre ring laid by the track side. Team Telecom also provided digital CCTV, Passenger Information Systems and Help Points at all stations, new Ticket Machines, data and telephony to travel centres, Evacuation Public Address Systems (PAVA) and Positive Train Identification (PTI).

Communication Services and Facilities

The Metro Control Centre at South Gosforth maintains two-way radio contact with the driver-only operated trains. In September 2007 a new central system control desk for all signalling and communications was installed to replace the original equipment.

Magnetic track circuits operate fixed-colour light signalling, generally three-aspect in tunnels and two-aspect on surface lines. A train identification and control system carries information from on-board transponders to track-level equipment, which operates the points and station information systems. The T&W Metro uses a train-stop system based on the Indusi signalling system used by German and Austrian railways.


High-resolution digital CCTV cameras are deployed throughout the network and cover all stations for security monitoring and safety. Smoke and chemical sensors are deployed throughout the underground facilities to detect any abnormal conditions which may develop into a more serious situation. Stations have passenger alarm points, which are monitored by the high-resolution digital CCTV system. The Metro was the first light rail system in the UK to offer complete GSM mobile phone coverage, including on all underground sections and stations. All mobile network providers are currently working on a new antenna system that will provide continuous coverage through the central underground part of the Metro, including all tunnels, platforms and concourses.

3.2. Colle Capretto Tunnel, San Gemini, near Perugia, Italy

The Colle Capretto tunnel is 1,171 m long and is located near San Gemini in Italy, forming part of the SS3bis Autostrada (which is part of the E45 TEN route). It is a twin-bore tunnel, with traffic running in a single direction through each bore. The tunnel is monitored from a control room located in Perugia (about 70 km to the north), not at the tunnel site itself.

Figure 6: Location Map of the Colle Capretto Tunnel

Traffic levels in 2007 were around 10,000 vehicles per day, with HGVs comprising 30% of this volume, and there are no limitations on the movement of hazardous goods through the tunnel (EuroTest, 2007). Originally completed and opened in 1974, the Colle Capretto tunnel was refurbished in 2010, as the existing safety equipment, lighting and ventilation systems of both bores were rather dated, with a low efficiency level. Moreover, there were no modern safety fittings and no communication or monitoring facilities linked to any remote control station. The independent EuroTest 2007 survey rated Colle Capretto as a low-risk tunnel but gave it a very poor score for safety features.


The upgrading of the tunnel fell within the scope of European Directive 2004/54/CE on minimum safety requirements for tunnels in the Trans-European Road Network, transposed into Italian law by Legislative Decree 264/06, which provides for the upgrading of all Italian TERN tunnels by 2019 (TEN-T EA, 2011). This action aims to upgrade the tunnels in order to guarantee road users a higher level of service, comfort and safety. In more detail, the upgrading consists of the following activities:

- Replacement of the lighting systems
- Upgrading of the ventilation systems
- Placing of new emergency stations, monitoring cameras, lane control signals, variable message signs and fire detection systems
- Lighting and pressurisation of evacuation and escape routes
- Construction of a new drainage system
- White painting of the inner walls of the tunnels
- Preparation of a safety document on the measures to be adopted in case of emergency and the systems installed

The main safety characteristics of the refurbished Colle Capretto tunnel can be summarised as follows:

Traffic Rules:
- 90 km/h speed limit
- No overtaking, U-turns or stopping

Emergency facilities:
- There is one emergency exit located approximately halfway along the tunnel, providing access into a pressurised chamber which links the two tunnel bores. This serves as an additional escape and rescue route in addition to the two tunnel portals
  o Upon opening the door into the chamber from one tunnel, the VMS systems will immediately stop the traffic entering the opposite tunnel
- Rescue service vehicles can cross over at the portals
- There are no lay-bys or emergency refuge lanes
- High-visibility signage and VMS at the tunnel portals inform travellers about tunnel conditions
- Emergency phones and fire extinguishers are located every 300 m
- Automatic fire alarm system: in the event of fire, fire ventilation is automatically activated, the tunnel is closed and the fire brigade is notified
- Air-flow monitoring and ventilation fans powerful enough to deal with a fire and high smoke levels

The entire tunnel (i.e. both bores) is monitored by a network of CCTV cameras and environmental sensors (air flow, CO2, NOx etc.) and all information is relayed back to the control centre in Perugia. A dedicated software interface is used by the control centre to monitor the conditions and take necessary steps to avoid high levels of pollutants building up in the tunnels, manage traffic flows and inform drivers of road conditions. Unlike the Gotthard Tunnel, the shorter length and twin bore nature of the Colle Capretto tunnel allows for one bore to be temporarily closed (for maintenance, training for rescue operations etc.) and traffic diverted to run both ways on a single lane basis through the other bore, without causing major disruptions to the overall traffic conditions. This flexibility afforded by twin bore tunnels will be utilised for the SAVE ME trials.
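The control-centre logic described above, in which pollutant and air-flow readings trigger ventilation and traffic-management actions, can be sketched as a simple rule check. This is purely illustrative: the sensor names, units and limit values below are assumptions, not the actual software used by the Perugia control centre.

```python
# Illustrative threshold check of the kind a tunnel control-centre interface
# might perform; sensor names and limits are assumed, not the real system's.
LIMITS = {"co2_ppm": 5000, "nox_ppm": 25, "air_flow_ms": 1.0}

def check_bore(readings):
    """Return the list of alarms raised by one bore's sensor readings."""
    alarms = []
    if readings["co2_ppm"] > LIMITS["co2_ppm"]:
        alarms.append("CO2 above limit: increase ventilation")
    if readings["nox_ppm"] > LIMITS["nox_ppm"]:
        alarms.append("NOx above limit: increase ventilation")
    if readings["air_flow_ms"] < LIMITS["air_flow_ms"]:
        alarms.append("Air flow too low: check fans")
    return alarms

# Example: CO2 elevated in one bore, other readings normal
alarms = check_bore({"co2_ppm": 6200, "nox_ppm": 12, "air_flow_ms": 1.8})
```

In a real deployment each alarm would of course feed the operator interface and VMS rather than a simple list.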


The table below shows the attributes of each Pilot Site:

Pilot Site: Colle Capretto Tunnel
- Overseeing partner: CNVVF and IES
- Site contacts: Marcello Marzoli ([email protected]); Uberto Delprato ([email protected])
- Available infrastructure: Tunnel and control room
- Estimated number and type of users: Professionals: 3 operators & 6 rescuers; Users: 20 "average" users, 10 children, 10 elderly and 5 tourists (those with language barriers)
- Realisation period: M32-33

Pilot Site: Monument Metro Station
- Overseeing partner: UNEW
- Site contacts: Gareth Evans ([email protected])
- Available infrastructure: Metro area (platforms)
- Estimated number and type of users: Professionals: 3 operators & 6 rescuers; Users: 20 "average" users, 10 elderly, 10 children and 5 disabled (blind and physical impairments)
- Realisation period: M33

Table 1: Pilot Site Attributes


4. SAVE ME Use Cases, User Needs and Expected Impacts

In WP1, detailed analysis of different potential Use Cases (UC) and User Needs (UN) was conducted, and these will be used to guide the Pilot Plans.

4.1. Use Cases

In D1.1 (SAVE ME Application Scenarios and Use Cases), 12 general categories of UC were defined:

1. Crowd simulation

2. Emergency detection

3. Localisation

4. Telecommunication

5. Decision support system

6. Operator support

7. Individual guidance to travellers

8. Collective herding guidance

9. Guidance to rescue units

10. Infrastructure operator training

11. Emergency team training

12. General public training

Under each general category, a set of 62 individual UCs was then defined. From this set, the SAVE ME pilot tests will address the 37 UCs deemed 'essential' and, where possible, incorporate the 17 'secondary' UCs. The 8 'supportive' UCs will be incorporated only if there is no other sub-UC tested within the general UC category.

4.2. User Needs

WP1 also undertook a detailed analysis of the different environments, events and user group needs associated with transport disasters and emergencies. The analysis focussed on five main transport disaster scenarios which will be used to inform the final scenarios for the testing:

- Natural – Geological – Earthquakes
- Artificial – Unintentional – Structural Failure
- Artificial – Unintentional – Transport crash/collision
- Artificial – Unintentional – Fire
- Artificial – Intentional – Terrorism/Sabotage

A literature search and review identified a diverse range of User Needs (UNs), and a Focus Group was conducted in Newcastle (UK) to help identify key user issues, needs and solutions in transport emergencies. All are fully defined and described in D1.1, and have been organised into five general categories:


- HMI design/development
- Technical development
- Information needs
- Emergency responders
- Travellers

4.3. Expected Impacts

The table below summarises these UNs, along with the level of expected impact provided by the SAVE ME system and innovations:

(Format of each entry: Feature (target group): SAVE ME application(s). Expected impact)

HMI Development
- Human Factors (all users): Evacuation guidance through signage and mobile devices. Expected impact: ++
- Map Parameters (all users): Map interface on mobile devices. Expected impact: 0

Technical Development
- Disaster Evacuation through Mobile Devices (all travellers): WSN localisation of travellers; evacuation guidance through mobile devices. Expected impact: +

Information Needs
- Information for Rescuers (rescue personnel): WSN environmental detection; WSN localisation of travellers; ontology of disasters; rescue guidance and support through DSS and PDAs; training. Expected impact: ++
- Information for Travellers (adult travellers): Evacuation guidance through mobile devices; training. Expected impact: +
- Information for Travellers (vulnerable users): Personalised evacuation guidance through mobile devices. Expected impact: +
- Information in Tunnels (all travellers): Evacuation guidance; training. Expected impact: +
- Information Management Systems (infrastructure operators; rescue personnel): Ontology of disasters; DSS. Expected impact: ++

Emergency Responders
- Emergency Co-ordination Centre (rescue personnel): WSN environmental detection; WSN localisation of travellers; ontology of disasters; DSS. Expected impact: +

Travellers
- Crowd Behaviour (all travellers): Collective herding guidance; personalised evacuation guidance through mobile devices; training. Expected impact: +
- Behaviour in Tunnels (Confined Spaces) (all travellers): WSN localisation of travellers; personalised evacuation guidance through mobile devices; training. Expected impact: 0
- Evacuation of Older and Mobility Impaired Users (vulnerable users): WSN localisation of travellers; personalised evacuation guidance through mobile devices; rescuer guidance to reach vulnerable users. Expected impact: +

Table 2: SAVE ME User Needs and Expected Impacts

Expected impact: ++ very positive; + positive; 0 neutral; - negative; - - very negative


4.4. List of Use Cases to be Implemented

The following table outlines which of the UCs will be implemented in the tests undertaken at each pilot site (Ess – Essential; Sec – Secondary; Sup – Supportive):

1. System administration
1.1 User profile creation (for personalization of service) [Ess]
1.2 Registration and login (for infrastructure operator and for rescue team member) [Sec]
1.3 Unregistration and logout (as above) [Sec]
1.4 Monitoring of system operation and maintenance [Sec]
1.5 Adding/deleting sensors and functions [Ess]

2. Crowd simulation
2.1 Real-time data fusion to the simulation [Ess]
2.2 Behavioural habits of vulnerable users in catastrophes (different UCs for different user groups, e.g. children, elderly, etc.) [Ess]
2.3 Effect of DSS info and guidance to the crowd movements [Ess]

3. Emergency detection
3.1 Emergency event detection
3.1.1 Emergency event detection – Traffic Incident [Sec]
3.1.2 Emergency event detection – Traffic Accident [Sec]
3.1.3 Emergency event detection – Fire [Ess]
3.1.4 Emergency event detection – Smoke [Ess]
3.1.5 Emergency event detection – Shot [Sup]
3.1.6 Emergency event detection – Explosion (Gas) [Sec]
3.1.7 Emergency event detection – Explosion (Solid) [Sec]
3.1.8 Emergency event detection – Liquid Leakage [Sup]
3.1.9 Emergency event detection – Gas Leakage [Sup]
3.1.10 Emergency event detection – Earthquake [Ess]
3.1.11 Emergency event detection – Flood [Ess]

3.2 Person detection
3.2.1 Emergency person detection – Person movement [Sec]
3.2.2 Emergency person detection – Person counting through a gate [Sec]


3.2.3 Emergency person detection – Person counting in a limited closed space [Sec]
3.2.4 Emergency person detection – Person detection under ruins [Sup]
3.2.5 Emergency person detection – Person localisation in a limited closed space [Ess]

3.3 Vehicle detection
3.3.1 Emergency vehicle detection – Vehicle Counting [Sup]

4. Localisation
4.1 Vehicles and people localization in a tunnel/underground station [Ess]
4.2 People localisation in a vehicle [Ess]

5. Decision support system
5.1 Routing for optimal evacuation on a group-wise manner [Ess]
5.2 Personalized routing for trapped travellers [Ess]
5.3 Personalized routing for rescue teams [Ess]
5.4 Automatic reconfiguration of network in case of communication network loss [Ess]

6. Operator support
6.1 Info on the type of incident
6.1.1 Info on the type of incident - Informative [Ess]
6.1.2 Info on the type of incident - Cautionary [Ess]
6.1.3 Info on the type of incident - Alerting [Ess]
6.2 Info on the affected area [Ess]
6.3 Next steps and imminent actions [Ess]
6.4 Communication with the service centre in case of disaster [Ess]
6.5 Communication with the emergency teams [Ess]
6.6 Communication with third parties [Sup]
6.7 Manage and store real-time info, through communication with DSS [Sec]

7. Individual guidance to travellers
7.1 Communication with the crowd in case of disaster, through mobile phones [Ess]
7.2 Personalised information on the emergency [Ess]
7.3 Personal evacuation guidance (according to needs and preferences) [Ess]

8. Collective herding guidance
8.1 Communication with the crowd in case of disaster with the collective herding system [Ess]
8.2 Emergency notification to non-involved users [Sup]
8.3 Evacuation guidance [Ess]

9. Guidance to rescue units


9.1 Communication with and among the rescue team in case of disaster [Ess]
9.2 Compass function to guide the rescue team to the disaster area [Sec]
9.3 Localisation function for the rescue team [Sec]
9.4 Priority guidance to individual travellers trapped in the area [Ess]
9.5 Send Alert to Emergency Centre [Ess]

10. Infrastructure operator training
10.1 SAVE ME system installation simulation training through VR platform [Ess]
10.2 Realistic emergency simulation training through VR platform [Ess]
10.3 Location-based announcement displays training through VR platform [Sec]
10.4 PDA use training through VR platform [Sec]
10.5 Communication and co-ordination training through VR platform (between infr. operators and other key players) [Sec]

11. Emergency team training
11.1 Use of the supporting and alerting devices through VR platform [Sec]
11.2 Communication and co-ordination training through VR platform (between emergency team and other key players) [Ess]

12. General public training
12.1 Training on the use of SAVE ME application on the mobile phone [Ess]
12.2 Explanation on the guidance messages [Sup]
12.3 Personalised training [Ess]
12.4 System limitations [Ess]

Table 3: SAVE ME Use Cases to be Implemented at Each Pilot Site

These UCs will be combined into various scenarios to be carried out at each Pilot Site. These scenarios will be developed in consultation with the representatives of each Pilot Site, based on initial outline scenarios defined in the previous iterations of this deliverable. These scenarios are described in Chapter 7.


5. Common Evaluation Framework (CEF) and Evaluation Tools

This Chapter provides an overview of the proposed Common Evaluation Framework to be adopted for the pilot testing at the two sites outlined in Chapter 3. The chapter is divided into two parts: the technical evaluation of the underlying technologies and systems, and the non-technical evaluation of human factors and user acceptance. There are many possible approaches to the evaluation of such systems (e.g. production of system log files and performance charts for technical metrics; questionnaires and Focus Groups for user opinion/non-technical metrics), and within each approach there can be different sub-approaches to provide highly detailed assessment of specific metrics. For SAVE ME, it will be important to assess the technical aspects of the system's performance, but also to consider the user aspects, e.g. ease of use, confidence in the system, willingness to have the system, etc. Therefore, the Common Evaluation Framework (CEF) will be designed to encompass the following:

- Evaluation Criteria:
  o Reliability
  o Functionality
  o Application performance
  o System performance
- Confidence Indicators
- Usability Issues
- CBA/CEA & Market Viability (WTH/WTP)
- Thresholds for Safety Impacts
- Evaluation tools to be used for both pilot sites

In the SAVE ME Technical Annex, the following high-level evaluation criteria are given:

- Availability: Both pilot site tests performed
- Reliability: All modules performed as specified at the tests; reliability of the system >90%
- Effectiveness: DSS judged as enhancing the effectiveness of the operation by at least 20% by 5 internal and 5 external experts
- Response Time and Efficiency: Total response time of the rescue team is improved by at least 15% compared to pre-test scenarios
- Usability: A mean score of >7 (on a 0-10 scale) for each project module by all users who took part in the testing
- Market Viability: Positive WTH and positive CBA for the overall system by pilot sites and 4 additional external experts
- Accessibility and Inclusiveness: >80% of vulnerable users in the pilot tests being evacuated through the successful use of the system, therefore "having been saved"

Table 4: SAVE ME High-Level Evaluation Criteria
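As an illustration of how pilot outcomes could be screened against these quantitative thresholds, the sketch below encodes the numeric criteria from the Technical Annex as simple pass/fail checks. The variable names and the result values are hypothetical placeholders, not real pilot data.

```python
# Pass/fail screening of pilot results against the quantitative thresholds
# in Table 4. All result values below are made-up placeholders.
thresholds = {
    "reliability_pct":          lambda v: v > 90,   # system reliability >90%
    "effectiveness_gain_pct":   lambda v: v >= 20,  # DSS effectiveness gain
    "response_time_gain_pct":   lambda v: v >= 15,  # rescue response improvement
    "usability_mean_0_10":      lambda v: v > 7,    # mean usability score
    "vulnerable_evacuated_pct": lambda v: v > 80,   # vulnerable users "saved"
}

results = {
    "reliability_pct": 93.5,
    "effectiveness_gain_pct": 22,
    "response_time_gain_pct": 17,
    "usability_mean_0_10": 7.8,
    "vulnerable_evacuated_pct": 85,
}

verdict = {name: check(results[name]) for name, check in thresholds.items()}
all_met = all(verdict.values())
```

Qualitative criteria such as Availability and Market Viability would be assessed by expert judgement rather than a numeric check of this kind.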

These high-level criteria will be addressed through various means, for both technical and non-technical metrics, as defined in this chapter.


5.1. Technical Metrics

The technical evaluation should be clearly defined in such a way that it allows for a robust evaluation of the system. This is done by defining a range of metrics which enable the assessment of the overarching SAVE ME objectives, and of the extent to which these are achieved. These metrics need to be defined in order to enable the measurement of the reliability and effectiveness of the components in the SAVE ME system, and to assess how the system performs. Indicators for the SAVE ME technical evaluation will therefore be classified into three types: Reliability, Application Performance and System Performance.

Reliability: A system can be defined as 'reliable' if it performs as expected, and in the same way, under the same conditions regardless of when or where the test is performed. It must not fail in unexpected situations or environments. A system is not reliable if the outcome of runs with identical input parameters is not consistent between individual runs.

Application Performance: SAVE ME will develop guidance applications for both rescuers and travellers, which will be implemented on a number of different mobile platforms. Application Performance will be measured by assessing how the different applications run on each mobile platform, and whether users can actually use the application correctly.

System Performance: The core of the SAVE ME technological system is a series of WSNs and the DSS, coupled by a failsafe telecommunications infrastructure. The performance of the system will therefore be measured by the accuracy of the environmental and localisation sensors; the latency in the time taken for correct data to be passed from the WSN components to the DSS; the time taken to process this data and determine the mitigation strategies; and the time taken to return this information to the travellers. From these measurements, system effectiveness and efficiency can be determined.
Reliability-based Metrics

The reliability of a system can be defined as the probability that a component, module or sub-system of a complete system will work correctly over a given period of time under a given set of operating conditions. Applying the concept of Receiver Operating Characteristics (ROC) and using a "confusion matrix", a reliable system is one which will return a high proportion of true positives (TP: correct action when an event is detected and mitigation is needed) and true negatives (TN: no action where no event is detected and no mitigation is necessary), and a low proportion of false positives (FP: a false alarm where action is requested in a non-emergency situation; statistically defined as a Type I error) and false negatives (FN: an alarm not activated in an emergency situation when mitigation action is required; statistically defined as a Type II error). TP, TN, FP and FN can be defined per unit distance, per unit time or per event. The relationship between an actual event and the prediction is shown in the following confusion matrix:


                          Actual Value
                          p                n
System        p'          True Positive    False Positive    P'
Prediction    n'          False Negative   True Negative     N'
              Total       P                N

Figure 7: Confusion Matrix, showing system predictions and actual events

From this confusion matrix, a range of metrics can be determined for each component of the SAVE ME system (Fawcett, 2006):

- Sensitivity or true positive rate (TPR) = TP / P = TP / (TP + FN)
- False positive rate (FPR) = FP / N = FP / (FP + TN)
- Accuracy (ACC) = (TP + TN) / (P + N)
- Specificity (SPC) or true negative rate = TN / N = TN / (FP + TN) = 1 - FPR
- Positive predictive value (PPV) or precision = TP / (TP + FP)
- Negative predictive value (NPV) = TN / (TN + FN)
- False discovery rate (FDR) = FP / (FP + TP)
- Matthews correlation coefficient (MCC): a correlation coefficient between the observed and predicted binary classifications, also known as the phi coefficient
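These metrics can be derived mechanically from the four raw counts of a confusion matrix. The helper below is an illustrative sketch, not part of the SAVE ME toolchain; the function name and the example counts are invented for demonstration.

```python
import math

def confusion_metrics(tp, fp, fn, tn):
    """Derive the Fawcett (2006) metrics from raw confusion-matrix counts.
    Assumes all marginal totals are non-zero (otherwise ratios are undefined)."""
    p, n = tp + fn, fp + tn                      # actual positives / negatives
    mcc = (tp * tn - fp * fn) / math.sqrt(
        (tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return {
        "TPR": tp / p,                           # sensitivity
        "FPR": fp / n,                           # false positive rate
        "ACC": (tp + tn) / (p + n),              # accuracy
        "SPC": tn / n,                           # specificity = 1 - FPR
        "PPV": tp / (tp + fp),                   # precision
        "NPV": tn / (tn + fn),
        "FDR": fp / (fp + tp),
        "MCC": mcc,                              # phi coefficient
    }

# Invented example: 90 correct alarms, 5 false alarms,
# 10 missed events, 895 correct non-alarms
m = confusion_metrics(tp=90, fp=5, fn=10, tn=895)
```

A detector logged per event could then be summarised by these values at the end of each pilot run.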

For SAVE ME, given the importance of the system outputs and the potential implications for human injury and even loss of life, it will be crucial that the system delivers a reliable decision, where a clear distinction is made between true and false actions based upon the correct detection of events. For example, the false detection (FP) of a fire could result in an unnecessary evacuation, with the only negative results being the inconvenience to those involved and the cost of the disruption to the emergency services and transport service providers. At the other end of the scale, the non-detection (FN) of a fire could lead to injury or even loss of life if it is not quickly detected by other non-technical means, and thus has severe consequences.

The reasons why a FP or FN decision has been returned by the system will need to be investigated and logged for future reference. It may be a case of an interrupted power supply, non-smoke particulates interfering with or activating smoke/fire sensors, or other external issues (e.g. vandalism) which are not directly linked to the underlying hardware and software developed for SAVE ME. However, if there are repeat occurrences of FP or FN decisions which cannot be traced to a non-system fault, it will be imperative that these faults are rectified as soon as is practically possible.


Performance Metrics

The performance of the SAVE ME application and system can be defined as the level at which they operate under emergency conditions. There will be a baseline at which the system operates during non-emergency events; however, in an emergency a 'good' performance will result from the system operating as expected (and it is likely that users will not 'notice' the system in operation during their evacuation), whereas a 'bad' performance will result from the system operating erratically, unexpectedly or, in the worst case, malfunctioning to a state of total failure. In this case, users will be very aware of the poor performance of the SAVE ME system. For SAVE ME, the performance can be measured in a number of ways.

Accuracy is of crucial importance to SAVE ME, both in terms of the positioning of people within the transport infrastructure, and for the detection and verification of an emergency event (e.g. "Is there definitely an explosion located at the south end of Platform 2?"). The need for accuracy was shown by the London 7/7 terrorist attacks, where explosions on the London Underground were initially thought to be track power surges. A more accurate understanding of the situation could potentially have reduced the impact of the attacks.

Latency is a significant metric for the SAVE ME system. For any system involved with the safety and security of humans, the time taken to identify the event and implement the appropriate mitigation strategies is essential. The whole system must be able to execute all required tasks within a strict time limit, which for SAVE ME includes the following:

- The emergency event needs to be detected and localised
- The localisation of people within the infrastructure needs to be computed
- All information needs to be passed to the DSS
- The mitigation strategy needs to be determined by the DSS
- Where relevant, evacuation information needs to be passed back to users and guidance information provided to rescue personnel

In SAVE ME, latency throughout the entire system will be an important consideration; for example, there is little value in the system's Environmental Sensor module being able to quickly identify an emergency event if the DSS takes too long to receive and process this information and return mitigation information to users. However, as a number of modules are involved in the detection and decision processes, it may be a complex task to identify which section(s) of the overall system architecture are responsible for the latency, if system performance does not meet the predefined requirements (from the Use Cases). There may also be potential conflicts between the localisation programme and the mitigation/evacuation programme running on a user's device, and between the various WSNs (Environmental, Localisation, Telecommunications), depending on the frequencies and communications protocols used (Zigbee, Bluetooth, WiFi etc.).

Availability/Robustness of the system can be defined as the proportion of time the SAVE ME system is operational, in particular during an emergency event. The system architecture is designed in such a way as to incorporate resilience and fault-tolerant characteristics; however, it is not possible to build a system that will be 100% available, 100% of the time. The availability will be measured by continuous monitoring of the system's detection components during the pilot tests.
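The end-to-end latency budget above can be decomposed from timestamped log entries, one per pipeline stage, to locate the stage responsible for any lag. The sketch below is illustrative only: the stage names and log format are assumptions, not the actual SAVE ME log schema.

```python
from datetime import datetime

# Hypothetical timestamped log events for one emergency, one per pipeline
# stage (stage names are illustrative, not the real SAVE ME schema).
events = [
    ("sensor_detection",   "2010-12-01T10:15:00.200"),
    ("wsn_to_dss",         "2010-12-01T10:15:12.900"),
    ("dss_strategy_ready", "2010-12-01T10:15:31.400"),
    ("guidance_delivered", "2010-12-01T10:15:44.050"),
]

def stage_latencies(events):
    """Per-stage and end-to-end latency (seconds) between consecutive stamps."""
    times = [datetime.fromisoformat(stamp) for _, stamp in events]
    stages = {
        f"{events[i][0]} -> {events[i + 1][0]}":
            (times[i + 1] - times[i]).total_seconds()
        for i in range(len(times) - 1)
    }
    stages["end_to_end"] = (times[-1] - times[0]).total_seconds()
    return stages

lat = stage_latencies(events)
# The bottleneck stage is the one with the largest per-stage latency
slowest = max((k for k in lat if k != "end_to_end"), key=lat.get)
```

Run over all logged emergencies, such a breakdown would show whether a missed latency target stems from sensing, transmission, DSS processing or guidance delivery.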


Measurements will be needed in order to calculate and evaluate the accuracy of the different technical metrics. Two kinds of reference frame are needed:

- Spatial: a precise spatial reference frame is needed for accurate localisation of events and people within a transport infrastructure
- Temporal: a precise temporal reference frame is needed in order to establish a common time between components and to evaluate the response and performance of the system. A means of defining a start point upon detection of an emergency needs to be incorporated into the process, to enable the duration of a critical situation to be established for future reference.

The production and output of technical metrics will be the responsibility of the partners developing the various SAVE ME system modules and components, as shown in the table below:

SAVE ME Component                       Partner(s) Responsible
Environmental Sensors                   MIZAR and CERTH/ITI
Localisation Sensors                    UNEW
Telecommunications                      UPM
DSS                                     CERTH/ITI
Simulation Output                       SIMUDYNE
Operator Support Unit                   USTUTT
PDA-based Guidance for Rescuers         MIZAR
Mobile-based Guidance for Travellers    CERTH/HIT
Collective Herding Devices              MIZAR

Table 5: Evaluation Components and Partners Responsible

Testing and Development Methods

A series of laboratory-based tests of the various components in the SAVE ME architecture will be required before real-world testing. These will be performed on a stand-alone basis, with iterative development to expand the functionality of the components in a stepwise procedure. Full-scale integration and pre-pilot testing will form the mid-point of the Pilot Tests, where components are brought together and any errors/conflicts identified and debugged. Finally, full-scale pilot testing will take place at the two pilot sites in Newcastle and Colle Capretto. The gathering of technical data (system performance) is the main focus of the lab tests; the relevant parameters will therefore be tested there, rather than in the on-site tests in Newcastle and Colle Capretto.

Recording Tools

All technical metrics will be recorded using system log files. These files will have a data structure appropriate to the task performed by each module or system component, but, as noted, it will be important for time stamps to be used to help mark the start point of an emergency event and also to identify the root of any errors or malfunctions within the system. System log files (from all components) will be recorded and archived to allow for post-event review and analysis. This accumulation of data into a database may have future benefits by providing additional input into the DSS for future emergency events. Based upon the UCs defined in WP1 (D1.1), the following technical metrics have been defined and, where appropriate, will be used to evaluate the SAVE ME system:
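A log record of the kind described, carrying a time stamp, a component identifier and an offset from the emergency start point, might look as follows. The JSON schema and field names are illustrative assumptions, not an agreed SAVE ME log format.

```python
import json
from datetime import datetime, timezone

def log_record(component, event_type, payload, t0=None):
    """One illustrative log entry: UTC time stamp, component ID and, when the
    emergency start point t0 is known, the offset from it, so the duration of
    a critical situation can be reconstructed in post-event analysis."""
    now = datetime.now(timezone.utc)
    record = {
        "timestamp": now.isoformat(),   # common temporal reference frame
        "component": component,         # which module produced the entry
        "event": event_type,
        "data": payload,
    }
    if t0 is not None:
        record["t_since_emergency_s"] = round((now - t0).total_seconds(), 3)
    return json.dumps(record)

# Example: a smoke detection logged against an emergency start point
t0 = datetime.now(timezone.utc)
line = log_record("env_sensor_07", "smoke_detected", {"ppm": 412}, t0)
```

Archiving one such line per event, from every component, would support both the latency breakdowns and the confusion-matrix counts discussed earlier.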


Metric | Criterion | Quantifiable Indicator Threshold | Indicator (Units) | Means of Measurement | Success Criteria

Reliability
User Positioning and Localisation | Number of users correctly located by the system | Accuracy: <= 5 m (metro); <= 10 m (tunnel) | User position (metres) | Log file | 90% of users located within threshold
System Connectivity | Full signal transmission to the operations centre in real time | 100% signal power; < 3 sec to establish comms. | Signal availability (%); transmission time (sec) | Log file | Signal established within time threshold
Coverage | Amount of the infrastructure's area covered by the sensor and detection system | >= 95% | Proportion of public-area floor space monitored by the system | Manual analysis using DSS map | 95% of public floor area (in test situations) covered by the system
Information Accuracy | Degree of correctness of the evacuation information provided | >= 90% | - | Log files or questionnaires | Correct information provided to users

System Performance
Latency | Time between the detection of the event and the message transmitted to the DSS | <= 60 seconds | Time to receive message at DSS after event detected (seconds) | Log file | 95% of emergency events detected at DSS within 60 seconds
True Detection of Emergency Event | System correctly identifies an event occurrence and notifies the DSS | At least 90% TP; 95% or 99% for specific situations (see D1.1) | True positives (%) | Log files | 90% of TP events correctly identified
False Alarm of Emergency Event | System incorrectly identifies an event occurrence but still proceeds to notify the DSS | FP <= 1% | False positives (%) | Log files | No more than 1% of FP occurrences passed to the DSS
Person Detection | System identifies the presence of a person within the infrastructure | TP >= 80% | True positives (%) | Log files | At least 80% of users in the infrastructure identified
System Autonomy | Availability of the SAVE ME system in the event of an emergency | 30 minutes | Energy autonomy (minutes) | Log files | System can maintain itself under its own power for up to half an hour
Legibility | Signs and audio signals for non-mobile users are legible from a distance | <= 20 m | Sign visibility/signal audibility (metres) | Visual inspection or questionnaires | Users can clearly see signs at the given distances

Application Performance
Reliability of Connection | Service available during an emergency | >= 95% | System availability (sec or % of event duration) | Log files | System can maintain itself during an emergency
Guidance Accuracy | Emergency evacuation programme can guide people to the nearest exit | <= 3 m | User position (metres) | Log files | At least 95% of users are able to exit by the quickest route
Information Accuracy | Information provided in the emergency evacuation programme gives correct guidance | >= 95% | - | Log files | At least 95% of users are able to exit by the quickest route

Table 6: SAVE ME Technical Metrics
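As an illustration of how one of these metrics could be evaluated from the archived log files, the sketch below (with hypothetical timestamps) checks the latency success criterion, i.e. the share of emergency events reaching the DSS within 60 seconds of detection:

```python
# Sketch of the latency metric check from Table 6. The timestamp values
# are hypothetical; in practice they would be parsed from the system logs.
def latency_success_rate(detect_times, dss_times, threshold_s=60.0):
    """Fraction of events whose detection-to-DSS latency is within threshold."""
    within = sum(1 for d, r in zip(detect_times, dss_times) if r - d <= threshold_s)
    return within / len(detect_times)

detected = [0.0, 10.0, 20.0, 30.0]    # event detection times (seconds, illustrative)
received = [45.0, 40.0, 95.0, 75.0]   # corresponding DSS receipt times (seconds)
rate = latency_success_rate(detected, received)
print(rate)   # 0.75: three of the four latencies are within 60 s
```

The success criterion in the table would then be met when this rate is at least 0.95 over the recorded emergency events.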


5.2. User Acceptance (Non-Technical Metrics)

Along with the technical evaluation of the SAVE ME system and how it performs, it will also be important to provide a means of assessing and analysing users' attitudes towards, and acceptance of, the system. There is little use in having a system which performs well but does not deliver the services the user actually wants or needs. User acceptance metrics therefore need to be defined to enable measurement of the usefulness and desirability of the overall SAVE ME system. Indicators for the SAVE ME non-technical evaluation will be classified into two types: Functionality and Usability.

Functionality: A system's functionality can be defined by assessing the services it delivers and comparing these to the features/functions/services that the user(s) actually desires. Functionality must not be confused or combined with usability (see below), which is associated with measuring the degree to which individual functions meet the user needs. The two are interrelated, however, as additional functionality can have an impact upon the overall usability of a system or product.

Usability: The usability of the SAVE ME system can be defined using the ISO 9241-11 definition of usability, i.e. "The extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context of use".

Functionality Metrics

The term 'functionality' is quite generic in its scope and could therefore be interpreted in a number of different ways. However, the primary aim of assessing a system's functionality is to identify whether the user perceives the actual functions provided by the system to be useful, and whether those functions satisfy the user's needs or expectations by providing features which are deemed desirable or are needed by the user. It will also be important to determine whether there are additional functions which future iterations of the system could introduce to improve user satisfaction.

To assess the functionality of the SAVE ME system, it is proposed to adopt the Van Der Laan Acceptance (VDLA) scale, originally designed to measure drivers' acceptance of new vehicle technologies and therefore an apt scale to apply to the new SAVE ME technologies. The internal validity of the VDLA is robust; analysis of the responses from six separate tests which applied the VDLA returned α values of at least 0.73 (Van Der Laan, Heino and de Waard, 1997). The VDLA scale comprises nine five-point Likert scale items, which load onto two separate scales: one measuring the 'usefulness' of a system (odd-numbered items) and the other measuring 'satisfaction' with a system (even-numbered items):

1. Useful / Useless
2. Pleasant / Unpleasant
3. Bad / Good
4. Nice / Annoying
5. Effective / Superfluous
6. Irritating / Likeable
7. Assisting / Worthless
8. Undesirable / Desirable
9. Raising Alertness / Sleep-Inducing


The VDLA returns a usefulness score ranging from -2 to +2 (in steps of 0.2) based upon the corresponding Likert scale responses. The original SAVE ME technical annex indicated a user acceptance score threshold of 7 out of 10 (on a 0-10 scale), so to align the two it is proposed to adopt a usefulness score threshold of >= 0.8, which is equivalent to 70% (i.e. 7/10) along the scale. The VDLA also returns a satisfaction score, again ranging from -2 to +2 (in steps of 0.25). On the same basis, it is proposed to adopt a satisfaction score threshold of >= 0.75, which is equivalent to 68.75% (i.e. 6.875/10, the nearest value to 7/10) along the scale.

In addition to the VDLA, the assessment of functionality should give consideration to: the potential for integrating the current system's functions with those of existing systems; the maturity of the current system (i.e. how easy/feasible it will be for the system to become an 'off-the-shelf' product); whether the complete set of functions offered by the system is subject to competition from other systems; and whether there are any legal issues that need to be addressed due to missing or incomplete functions. All of these can be addressed either quantitatively (e.g. "On a scale of 1-10, please rate the usefulness of the (specific) functions provided by the system") or qualitatively (e.g. "What functions of the system did you like?", "What functions do you feel are missing from the system?", "Do you feel this function is well designed?", etc.). Adopting the former, closed, scale-based style of questioning allows different functions within a system to be given a 'rating' which can be analysed, typically for the purpose of comparison between functions (or different systems).
The use of open, descriptive questions in the latter line of questioning allows users to express in greater detail how they perceived the system. To fully assess the functionality of the SAVE ME system, it is proposed to use both forms of questioning.

Usability Metrics

Despite international standard definitions, usability as a specific concept can be difficult to define. It is therefore typically measured through a combination of metrics concerning 'effectiveness', 'efficiency' and 'satisfaction', where:

Effectiveness: Can users achieve what they want by using the system?

Efficiency: How many resources (for example, time) are needed to use the system?

Satisfaction: What do the users think of their interaction with the system?

There are now a number of recognised methodologies for the evaluation of usability, although most of these are primarily for software and ICT-based products.

SUMI: The Software Usability Measurement Inventory is a rigorously tested and proven method of measuring software quality from the end user's point of view.

QUIS: The Questionnaire for User Interaction Satisfaction is a tool developed by a multi-disciplinary team of researchers in the Human-Computer Interaction Lab (HCIL) at the University of Maryland at College Park. QUIS is designed to assess


users' subjective satisfaction with specific aspects of the human-computer interface. The QUIS team have successfully addressed the reliability and validity problems found in other satisfaction measures, creating a measure that is highly reliable across many types of interfaces.

CIF: The ANSI/INCITS-354 Common Industry Format for Usability Test Reports is a standard method for reporting usability test findings. The American National Standards Institute (ANSI) approved the CIF on 12 December 2001 as the standard for reporting usability test results. The purpose of the CIF is to encourage the incorporation of usability as an element in decision making for software procurement.

The above methodologies, as well as being aimed primarily at software applications, are relatively expensive to adopt (subscriptions are required). One widely used and accepted alternative that avoids this cost is the freely available System Usability Scale.

SUS: The System Usability Scale is a simple, ten-item attitude Likert scale (1 = Strongly Disagree, 5 = Strongly Agree) giving a global view of subjective assessments of usability. It was originally developed by John Brooke at Digital Equipment Corporation in the UK in 1986 as a tool for the usability engineering of electronic office systems. Since then, the SUS has been applied to many usability measurements and has undergone some slight adaptations, but the original structure remains the same. Results from the analysis of a large number of SUS applications show it to be a highly robust and versatile tool for usability, returning a reliability of α = 0.91 (Bangor, Kortum and Miller, 2009). The following list gives the items used in the version of the SUS applied by Bangor, Kortum and Miller (2009):

1. I think that I would like to use this product frequently
2. I found the product unnecessarily complex
3. I thought the product was easy to use
4. I think that I would need the support of a technical person to be able to use this product
5. I found the various functions in the product were well integrated
6. I thought there was too much inconsistency in this product
7. I imagine that most people would learn to use this product very quickly
8. I found the product very awkward to use
9. I felt very confident using the product
10. I needed to learn a lot of things before I could get going with this product

Bangor, Kortum and Miller also include an additional seven-step adjective Likert rating scale after the ten original Likert items:

11. Overall, I would rate the user-friendliness of this product as Worst Imaginable > Awful > Poor > OK > Good > Excellent > Best Imaginable

Analysis has shown that responses from the new Likert item correlate well with the existing SUS scores (r = 0.822). The SUS returns a usability score from 0-100 based upon the various Likert scale responses. The original SAVE ME technical annex indicated a usability score threshold of 7 out of 10 (on a 0-10 scale), and so to align the two, it is proposed to adopt a usability score threshold of >= 70/100 i.e. 7/10.
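The standard SUS scoring rule that produces this 0-100 score is fixed: odd (positively worded) items contribute their rating minus 1, even (negatively worded) items contribute 5 minus their rating, and the sum is multiplied by 2.5. A minimal sketch:

```python
# Standard SUS scoring (Brooke, 1986). Each of the ten items is rated 1-5;
# odd items contribute (rating - 1), even items contribute (5 - rating),
# and the total is scaled by 2.5 to yield a 0-100 usability score.
def sus_score(responses):
    """responses: list of ten ratings, item 1 first, each in 1..5."""
    assert len(responses) == 10
    total = sum(
        (r - 1) if i % 2 == 0 else (5 - r)   # 0-based index: even index = odd item
        for i, r in enumerate(responses)
    )
    return total * 2.5

print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))        # 100.0 (best possible)
print(sus_score([4, 2, 4, 2, 4, 2, 4, 2, 4, 2]) >= 70)  # True: meets 70/100 threshold
```

The second call illustrates how the proposed >= 70/100 threshold would be applied to an individual respondent's score.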


In addition to the SUS for assessing usability, further non-technical metrics pertaining to automation, mental workload, situation awareness and trust in the system will also be evaluated, using the questionnaires developed as an output of the SHAPE (Solutions for Human Automation Partnerships in European Air Traffic Management) project (Dehn, 2008). Although originally designed for assessing the implications of automation in Air Traffic Management and Control Centres, the SHAPE questionnaires are equally applicable to the assessment of other automated systems which support professional operators in their day-to-day activities, such as those developed for SAVE ME. One significant benefit of using the SHAPE suite of questionnaires is that the original set has undergone a series of iterative design improvements, including an empirical study with a sample of air traffic controllers, intended to increase their efficiency (e.g. time taken to complete) and user comprehension (e.g. construction and wording of questions). The final revised suite of questionnaires (which are freely available) from the SHAPE project is as follows:

Assessing the Impact of Automation on Mental Workload (AIM)

Automation usually aims to reduce workload by assigning to the machine tasks that were previously carried out by the human. However, automation can also yield new demands, in particular with respect to cognitive and perceptual activity. It is therefore important to ensure that new automation does not increase controllers' workload (EUROCONTROL, 2008). The AIM comes in a short version (AIM-s) and a long version (AIM-l). The AIM-l consists of eight subtests with four items each. The subtests are:

1. Building and Maintaining Situation Awareness
2. Monitoring of Information Sources
3. Memory Management
4. Managing the Controller Working Position
5. Diagnosing and Problem Detection
6. Decision Making and Problem Solving
7. Resource Management and Multi-Tasking
8. Team Awareness

The AIM-s consists of 16 items which are not divided into subtests. The overall consistency is high for both versions of the AIM (α = 0.97 for the long version and α = 0.95 for the short version). The subtests in the AIM-l show satisfactory internal consistencies ranging from α = 0.68 to α = 0.86, with the majority above 0.80. The AIM questionnaires return an overall score from 0-6 based upon the Likert scale responses. The original SAVE ME technical annex indicated a score threshold of 7 out of 10 (on a 0-10 scale), and so to align the two, it is proposed to adopt an AIM score threshold of >= 4.
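The AIM, like the other SHAPE questionnaires described below, returns a mean score on a 0-6 scale to which the proposed >= 4 threshold is applied. A minimal sketch with illustrative (hypothetical) item ratings:

```python
# Sketch of SHAPE questionnaire scoring (AIM, STQ, SASHA, SATI): the
# overall score is taken here as the mean of the 0-6 Likert item ratings,
# checked against the proposed >= 4 threshold. Ratings are hypothetical.
def shape_mean(ratings):
    """Mean of 0-6 Likert item ratings for one SHAPE questionnaire."""
    return sum(ratings) / len(ratings)

aim_short = [5, 4, 4, 6, 3, 5, 4, 4, 5, 4, 4, 5, 4, 3, 5, 4]  # 16 AIM-s items
score = shape_mean(aim_short)
print(score, score >= 4)   # 4.3125 True
```

The same threshold check applies unchanged to the STQ, SASHA and SATI scores described in the following subsections.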


SHAPE Teamwork Questionnaire (STQ)

The use of new automation can change teamwork and group interaction. This refers, among other things, to the allocation of tasks between team members and the way information is exchanged between them (EUROCONTROL, 2008). As with the AIM, there is a short version (STQ-s) and a long version (STQ-l) of the STQ. The STQ-l comprises six subtests with four items each (i.e. a total of 24 items). The six subtests used in the STQ-l are:

1. Team Situational Awareness
2. Team Roles & Responsibilities
3. Team Co-operation
4. Team Climate
5. Team Error Management
6. Team Communication

The STQ-s consists of 12 items which are not further divided into subtests. The overall consistency has been shown to be sufficient for both versions of the STQ (α = 0.88 for the long version and α = 0.76 for the short version). The subtests in the STQ-l show satisfactory internal consistencies ranging from α = 0.68 to α = 0.87, with only one subtest returning an α < 0.70. The STQ questionnaires return an overall score from 0-6 based upon the Likert scale responses. The original SAVE ME technical annex indicated a score threshold of 7 out of 10 (on a 0-10 scale), and so to align the two, it is proposed to adopt an STQ score threshold of >= 4.

Situational Awareness for SHAPE (SASHA)

By changing the allocation of tasks between the human and the machine, automation is likely to have an impact on a controller's Situational Awareness. It is therefore important to ensure that an automated tool does not impair a controller's Situational Awareness (EUROCONTROL, 2008). There is only one version of SASHA, which has six items. The test shows a high consistency, returning α = 0.86. Due to its conciseness, the SHAPE team suggest that the revised SASHA should mainly be used for screening purposes; in order to obtain more detailed insight into the way the new system might change the controller's understanding of the task environment, it is recommended to conduct an additional interview. The SASHA questionnaire returns an overall score from 0-6 based upon the Likert scale responses. The original SAVE ME technical annex indicated a score threshold of 7 out of 10 (on a 0-10 scale), and so to align the two, it is proposed to adopt a SASHA score threshold of >= 4.

SHAPE Automation Trust Index (SATI)

The use of automated tools will depend on the controllers' trust. Trust is a result of many factors, such as the reliability of the system and the transparency of its functions. For competent use of a tool, neither mistrust nor over-trust is desirable (EUROCONTROL, 2008). The revised SATI consists of six items which are not arranged into different subtests. The internal consistency of the new SATI is α = 0.83. Again, due to its conciseness, the new SATI should mainly be used for screening purposes, with follow-up questions to obtain more detailed insight. The SATI questionnaires


return an overall score from 0-6 based upon the Likert scale responses. The original SAVE ME technical annex indicated a score threshold of 7 out of 10 (on a 0-10 scale), and so to align the two, it is proposed to adopt a SATI score threshold of >= 4.

All four of the SHAPE questionnaires described above will be used in the evaluation of the non-technical metrics for SAVE ME. Where required, the wording of the original SHAPE questions will be amended to reflect the needs of the SAVE ME systems.

Data Acquisition Methods

The following non-technical aspects will be included in the questionnaires within the CEF:

Non-Technical Aspect | Indicator

User Acceptance:
- Usability of the SAVE ME system by members of the public
- Usability of the SAVE ME DSS in the operations of the institution/company
- Importance of DSS features to future activities (e.g. how many future services may use the DSS)

System Maturity:
- Maturity of the SAVE ME system (e.g. ready for use, needs further testing, needs further development, etc.)
- Estimated number of changes required to integrate the DSS into the user's existing systems (many/several/a few)

Completeness of System:
- Estimated number of features which are lacking in the current DSS
- List of missing or desired DSS features

Potential Market and Competition:
- Comparison with existing similar DSS (if any exist)

Legal:
- List of legal issues which may be an obstacle to implementing the DSS, e.g. existing patent violations
- Potential changes to legal requirements

Table 7: SAVE ME Non-Technical Aspects


A number of options are available for the recording of data relating to non-technical metrics. These include focus groups, stated preference surveys, revealed preference surveys, before-and-after comparisons of test participation, opinion surveys with the general public, expert observation of experimental activities, etc. Questionnaires and focus groups have already been applied in WP1 to help define and understand the user needs in emergency situations. Here, the primary data acquisition tool will be a series of user evaluation questionnaires, developed to gather the responses of the various target user groups (adults, vulnerable users, children, unfamiliar users, operators and rescue personnel) who participated in the final SAVE ME pilot testing. These questionnaires will incorporate the System Usability Scale, to help measure the specific usability of the systems, as well as the SHAPE questionnaires. Based upon the UCs defined in WP1 (D1.1), the following non-technical metrics have been defined and, where appropriate, will be used to evaluate the SAVE ME system:


Metric | Criterion | Quantifiable Indicator Threshold | Indicator (Units) | Means of Measurement | Success Criteria

Functionality
Operator Usefulness | Operators find the functions provided by SAVE ME useful | >= 0.8 on a -2 to +2 scale (step 0.2) | Van Der Laan acceptance scale (usefulness) | Van Der Laan questionnaire | Usefulness rated at least 70%
Traveller Usefulness | Travellers find the functions provided by SAVE ME useful | >= 0.8 on a -2 to +2 scale (step 0.2) | Van Der Laan acceptance scale (usefulness) | Van Der Laan questionnaire | Usefulness rated at least 70%
Rescuer Usefulness | Rescuers find the functions provided by SAVE ME useful | >= 0.8 on a -2 to +2 scale (step 0.2) | Van Der Laan acceptance scale (usefulness) | Van Der Laan questionnaire | Usefulness rated at least 70%
Operator Satisfaction | Operators are satisfied with the SAVE ME functions | >= 0.75 on a -2 to +2 scale (step 0.25) | Van Der Laan acceptance scale (satisfaction) | Van Der Laan questionnaire | Satisfaction rating of at least 70%
Traveller Satisfaction | Travellers are satisfied with the SAVE ME functions | >= 0.75 on a -2 to +2 scale (step 0.25) | Van Der Laan acceptance scale (satisfaction) | Van Der Laan questionnaire | Satisfaction rating of at least 70%
Rescuer Satisfaction | Rescuers are satisfied with the SAVE ME functions | >= 0.75 on a -2 to +2 scale (step 0.25) | Van Der Laan acceptance scale (satisfaction) | Van Der Laan questionnaire | Satisfaction rating of at least 70%

Usability
System Administration | Users can log on and create profiles on the system | 70 out of 100 | SUS scale (usability units) | SUS in user questionnaire | Usability >= 70/100
Operator Support | System provides operators with appropriate information on the type of incident and area affected | 70 out of 100 | SUS scale (usability units) | SUS in user questionnaire | Usability >= 70/100
Travellers | Travellers are able to use the system to evacuate correctly | 70 out of 100 | SUS scale (usability units) | SUS in user questionnaire | Usability >= 70/100
Rescue Units | Rescuers are able to use the system to reach trapped travellers and rescue them | 70 out of 100 | SUS scale (usability units) | SUS in user questionnaire | Usability >= 70/100

Situation Awareness
Operator | Operators are able to fully comprehend the current situation | >= 4 on a 0-6 SASHA scale | SASHA scale | Situational Awareness for SHAPE (SASHA) questionnaire (EUROCONTROL) | Awareness score of at least 70% (4/6)
Rescue Unit | Rescuers are able to comprehend the current situation | >= 4 on a 0-6 SASHA scale | SASHA scale | Situational Awareness for SHAPE (SASHA) questionnaire (EUROCONTROL) | Awareness score of at least 70% (4/6)

Workload
Operator | Operators' workload is not adversely affected by the system | >= 4 on a 0-6 AIM scale | AIM scale | Impact of Automation on Mental Workload (AIM-s) questionnaire (EUROCONTROL) | Workload score of at least 70% (4/6)
Rescue Units | Rescuers are able to work in a team using the system | >= 4 on a 0-6 STQ scale | STQ scale | SHAPE Teamwork Questionnaire (STQ-s) (EUROCONTROL) | Teamwork score of at least 70% (4/6)

Trust
Operators | Operators trust the system to act correctly | >= 4 on a 0-6 SATI scale | SATI scale | SHAPE Automation Trust Index (SATI) (EUROCONTROL) | Trust index of at least 70% (4/6)
Travellers | Travellers trust the system to guide/evacuate them correctly | >= 4 on a 0-6 SATI scale | SATI scale | SHAPE Automation Trust Index (SATI) (EUROCONTROL) | Trust index of at least 70% (4/6)
Rescue Units | Rescuers trust the system to support them in a rescue operation | >= 4 on a 0-6 SATI scale | SATI scale | SHAPE Automation Trust Index (SATI) (EUROCONTROL) | Trust index of at least 70% (4/6)

Table 8: SAVE ME Non-Technical Metrics


5.3. Economic Evaluation

In addition to the technical and non-technical user metrics, one further area of evaluative analysis of the SAVE ME system will be concerned with the costs and benefits of implementing the system. As part of WP9, an a priori Cost Benefit Analysis (CBA) will be carried out on the main SAVE ME exploitable results, utilising draft cost data supplied by the system developers. In addition, a Cost Effectiveness Analysis (CEA) will be performed using an Analytical Hierarchy Process methodology. Both the CBA and the CEA will be repeated a posteriori based upon actual cost data from the system developers. As part of WP8 and the Pilot Plans, information will be gathered on stakeholders' Willingness to Pay (WTP) and Willingness to Have (WTH) to assess the market viability of the SAVE ME system. This will be especially critical for system operators, who are the primary purchasers of the overall SAVE ME system, and it is therefore proposed to use Van Westendorp's Price Sensitivity Meter (VWPSM). The VWPSM works on the principle that there is a relationship between the price and the quality of a product, such that people are more willing to pay a higher price for a better quality product. The VWPSM asks four simple questions:

1. At what price would you consider the product so high that you would not purchase it? (Too Expensive)
2. At what price would you consider the product on the high side, but would still purchase it? (Expensive)
3. At what price would you consider the product a bargain, so you would purchase it? (Inexpensive)
4. At what price would you consider the product so low that you would question its quality and would not purchase it? (Too Cheap)

Cumulative percentages of the answers to questions 1 and 2, and the inverse of the cumulative percentages of the answers to questions 3 and 4, are plotted on a single graph (Figure 8). By analysing where the lines for different combinations of questions intersect, key breakpoints in the price range can be identified:

1. The PERCEIVED NORMAL PRICE: where equal numbers of people consider the offering inexpensive vs. expensive.
2. The PENETRATION PRICE: the price which maximises the number of people who would consider the offering, that is, the price at which the fewest people would consider the offering either too expensive or too cheap.
3. The HIGHEST REASONABLE PRICE: the price at which equal numbers of people consider the offering too expensive vs. "not expensive". At any higher price, decreasing volume overcomes increasing revenue.
4. The LOWEST REASONABLE PRICE: the price at which equal numbers of people consider the offering "too cheap" and "not cheap". At any lower price, decreasing revenue overcomes potential volume increases.


5. The difference between the LOWEST REASONABLE PRICE and the HIGHEST REASONABLE PRICE is considered the RANGE OF PRICING OPTIONS.

Unlike other methods used to establish pricing structures, the VWPSM does not begin by suggesting a price to the respondents; all other pricing techniques present a product to the respondent and ask for their reaction to a proposed price (Farace, 2008).
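As a sketch of how one such breakpoint could be located from survey answers (all respondent prices below are hypothetical illustration data), the perceived normal price is the point where the cumulative "expensive" curve meets the inverse cumulative "inexpensive" curve:

```python
# Sketch of locating one Van Westendorp breakpoint: the perceived normal
# price, where equal shares of respondents call a price expensive and
# inexpensive. All respondent answers below are hypothetical.
answers = [
    # (too cheap, inexpensive, expensive, too expensive) in euros
    (10, 20, 30, 50),
    (15, 25, 35, 55),
    (20, 30, 40, 60),
    (12, 28, 32, 45),
    (18, 22, 38, 65),
]
inexpensive = [a[1] for a in answers]
expensive = [a[2] for a in answers]

def crossing(price_grid):
    """Grid price minimising the gap between the cumulative share calling
    the price expensive and the inverse cumulative 'inexpensive' share."""
    def gap(p):
        exp_share = sum(x <= p for x in expensive) / len(expensive)
        inexp_share = sum(x >= p for x in inexpensive) / len(inexpensive)
        return abs(exp_share - inexp_share)
    return min(price_grid, key=gap)

print(crossing(range(0, 101)))   # 30: the perceived normal price for these data
```

The other breakpoints (penetration price, highest and lowest reasonable prices) follow the same pattern with the corresponding pairs of cumulative curves.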

Figure 8: Example of Graph Plot for Van Westendorp’s Price Sensitivity Meter (Farace, 2008)


6. Laboratory Pilot Tests

SAVE ME relies upon a complex WSN supporting the DSS; therefore, pre-pilot testing will be implemented in order to identify any significant errors and problems in the modules, software and/or hardware before the actual on-site pilot tests are undertaken. This will also allow for user and stakeholder pre-assessment of the relevant components of the SAVE ME system, essentially a 'debugging' procedure. As dictated by the general practice of usability testing, it will not be necessary to involve large numbers of users. The following table presents the various laboratory pilot tests to be performed, the partner responsible for the testing and evaluation, the estimated number of users and scenarios, the success criteria, and the timeframe in which the tests are to be carried out:

System to be tested | Partner responsible | Estimated number of users / scenarios | Success criteria | Realisation period
Iterative laboratory-based user testing for design of HMI elements of A3.2 | USTUTT | 6 operators, 10 rescue team members and 20 travellers | At least 3 new icons and 2 new earcons specified | M16-17
Enhanced simulation model of A5.1 | SIMUDYNE | 2 test scenarios per site, each involving 10 travellers | At least 20% improvement in reliability over non-adapted models | M18-19
Infrastructure operator support system of A6.1 | GST | 6 operators, 10 rescue team members and 20 travellers | Mean usability score >7 on scale of 0-10 | M22-23
Rescuers support system of A6.2 | CNVVF | 10 rescue team members | Mean usability score >7 on scale of 0-10 | M22-23
Individuals guidance and support system of A6.3 | CERTH/HIT | 20 travellers | Mean usability score >7 on scale of 0-10 | M22-23
Collective herding guidance of A6.4 | UNEW (DI) | 20 travellers | Mean usability score >7 on scale of 0-10 | M20-21
Operators' training of A7.1 | CERTH/ITI | 6 operators | Mean usability score >7 on scale of 0-10 | M22-23
Rescue support team training of A7.2 | IES | 10 rescue team members | Mean usability score >7 on scale of 0-10 | M22-23
General public training of A7.3 | COAT | 20 travellers | Mean usability score >7 on scale of 0-10 | M22-23
Table 9: SAVE ME Laboratory Technical Testing

Each of these laboratory-based tests must conform to the CEF guidelines set out in this document.
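Purely as an illustration (the ratings below are dummy data, not trial results), the "mean usability score >7 on a scale of 0-10" criterion used for most rows of Table 9 could be checked from collected ratings as follows:

```python
# Illustrative check of the "mean usability score > 7 on a 0-10 scale"
# success criterion. The ratings are dummy data, not SAVE ME results.

def criterion_met(scores, threshold=7.0):
    """Return (mean, passed) for a list of 0-10 usability ratings."""
    if not all(0 <= s <= 10 for s in scores):
        raise ValueError("usability ratings must lie on the 0-10 scale")
    mean = sum(scores) / len(scores)
    return mean, mean > threshold

ratings = [8, 7, 9, 6, 8, 9, 7, 8, 8, 7]   # e.g. 10 rescue team members
mean, passed = criterion_met(ratings)
print(round(mean, 1), passed)  # prints: 7.7 True
```

A real evaluation would draw the ratings from the pilot questionnaires rather than a hard-coded list, but the pass/fail logic is the same.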


7. Test Procedures, Scenarios and Roadmap for the Pilot Plans

To demonstrate all the components of the SAVE ME Disaster Detection and Management System (DDMS), plus the Decision Support System (DSS), the following scenarios are proposed. These will be narrowed down and specified in greater detail as the technical development progresses, to ensure that the final pilot tests are feasible and will return the right form of data to allow for a robust analysis of the whole SAVE ME system.

7.1. Pilot Test Administration and Set-up

The objectives of the Pilot Tests will be:

Assess the effectiveness and efficiency of the DDMS in different scenarios and working conditions, especially with regard to the DSS

Address the essential Use Cases (UCs) at each pilot site and raise site-specific issues for each location

Assess how the system action is perceived by each category of users (i.e. travellers, rescuers and operators)

To ensure the SAVE ME systems are tested in a fair yet robust manner, more than one run will be conducted per pilot site, starting with a simple/low-level scenario before adding complexities in the subsequent scenario(s). Having separate runs will allow the intelligence and flexibility of the DSS to be tested, and will enable pre- and post-system implementation to be assessed. The SAVE ME pilot tests will be conducted over a period of about one week at each pilot site during M32 to M33. This will allow the equipment to be installed, calibrated and tested prior to the tasks being carried out, and then uninstalled after the tests.

Regarding training, some users will be given training in the guidance systems and others will not; comparisons can then be made to assess the effectiveness of the training curricula provided. Each user will have a log-file of their actions recorded for monitoring and evaluation purposes. Safety experts will also be recruited at each test site to monitor, assess and evaluate the individual user profiles and behaviours, to help better understand the influence the SAVE ME system has on travellers. In addition to these experts, technical support from the relevant SAVE ME partners responsible for the development and implementation of the WP4 systems (UNEW, MIZAR, CERTH/ITI and UPM) will also be required.
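The per-user log-files mentioned above could feed the evaluation in several ways. As a purely hypothetical sketch (the record format, field names and values below are invented for illustration and are not a SAVE ME specification), per-user evacuation times might be extracted like this:

```python
# Hypothetical per-user action log records: (timestamp_s, user_id, action).
# Schema and values are invented; the actual log format is defined by the
# WP4 systems.
log = [
    (0.0,  "u1", "alert_received"),
    (4.2,  "u1", "guidance_opened"),
    (61.8, "u1", "exit_reached"),
    (0.0,  "u2", "alert_received"),
    (95.0, "u2", "exit_reached"),
]

def evacuation_times(records):
    """Seconds from 'alert_received' to 'exit_reached' for each user."""
    start, end = {}, {}
    for t, user, action in records:
        if action == "alert_received":
            start[user] = t
        elif action == "exit_reached":
            end[user] = t
    return {u: end[u] - start[u] for u in start if u in end}

times = evacuation_times(log)
print(times)  # {'u1': 61.8, 'u2': 95.0}
```

Summaries like these could then be compared between trained and untrained user groups, supporting the assessment of the training curricula.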


The following table outlines the various features which will be encompassed in the Pilot Tests, with any site-specific features noted in the appropriate column.

Feature | Common events/features | Newcastle specific | C. Capretto specific
System administration | Check the system status | |
Emergency detection | Smoke/Fire | Terrorist attack; Gas leak | Traffic accident
Event detection | Emergency person detection | Person counting | Person counting; Vehicles counting
Localisation | Vehicles and people localisation | People localisation | People in a vehicle
Decision support system | Routing for crowd evacuation; Personalized routing for travellers and rescue teams; Reconfiguration of network | |
Operator support | Info. on the type of emergency event and the affected area; Next steps/suggested procedure; Operator is presented with an overview of the emergency; Communication with the emergency team | |
Traveller support | Establish communication with the crowd; Provide collective and personalized info on emergency and evacuation guidance | |
Rescue unit support | Establish communication with and among rescue units; Provide priority guidance to individual travellers trapped in the area; Send alert to emergency centre | |
Table 10: Features to be included in the SAVE ME Pilot Testing


7.2. Newcastle Scenarios

The following tables outline the proposed scenarios which will be implemented at Monument Metro Station. These scenarios have been developed in consultation with the technical partners in WP4 and representatives of the Metro Station, who have indicated that litter bin bags located at various points along the platforms could be a likely source of a fire. However, in the pilot scenarios all emergency conditions will be simulated within the DSS.

RUN 1 - Simple Scenario
Start status: Normal operating conditions

Events (trigger-where-consequences on structures and people)

A) One of the platform bin bags is set alight (e.g. by a cigarette stub or match), causing a minor fire to begin
B) Smoke spreads across the platforms and through the rest of the station
C1) The passengers waiting on the platform begin to evacuate through the usual routes
C2) Those unable to use the escalator or stairs are directed to gather at a safe refuge point to await rescue
D) Fire is extinguished by emergency services
E) Operations return to normal

Information flow

{DDMS – operator} detect an emergency and provide an overview of it to the operator
{DDMS – emergency centre} provide a warning message
{Emergency centre – rescue teams} provide overview information about the emergency
{DDMS – travellers} activate auditory evacuation instructions and signs for collective herding; deliver personalized guidance for evacuation to each traveller on his/her mobile phone

Table 11: Monument Metro Station Scenario Run 1

RUN 2 - Medium Scenario
Start status: Normal operating conditions

Events (trigger-where-consequences on structures and people)

A) Building works at the top of the interconnecting stairs accidentally breach a supply pipe
B) A suspected gas leak starts to filter into the station environment
C1) Leak detected by sensors and emergency alarm activated
C2) Passengers waiting on the platform begin to evacuate through the usual routes
D1) OpsUI takes action to block exit route via stairs (capacity = 0) and passes this to DSS
D2) DSS recalculates exit routes and sends information to passengers
D3) Those unable to use the escalator or stairs are directed to gather at a safe refuge point to await rescue


D4) Rescuers arrive to evacuate those at refuge point and close off supply tap
E) Station cleared

Information flow

{DDMS – operator} detect an emergency and provide an overview of it to the operator
{DDMS – emergency centre} provide a warning message
{Emergency centre – rescue teams} provide overview information about the emergency
{DDMS – travellers} activate auditory evacuation instructions and signs for collective herding; deliver personalized guidance for evacuation to each traveller on his/her mobile phone

Table 12: Monument Metro Station Scenario Run 2

RUN 3 - Advanced Scenario
Start status: Normal operating conditions

Events (trigger-where-consequences on structures and people)

A) A suspect package is discovered in the middle of one of the platforms
B) Operators are alerted and standard evacuation procedures are started using the OpsUI to set relevant zone capacities to 0
C) A second suspect package is then discovered at the foot of the escalator, making this exit route unavailable
D1) OpsUI takes action to set additional zone capacities to 0 and alerts emergency services about the new situation
D2) DSS recalculates exit routes and sends information to passengers to exit via stairs only, using the other platform where necessary
D3) Those unable to use the escalator or stairs are directed to gather at a safe refuge point to await rescue
E) Rescuers arrive to evacuate those at the refuge point
F) Emergency services arrive to deal with the suspect packages
G) Situation given the all clear, operations return to normal

Information flow

{DDMS – operator} detect an emergency and provide an overview of it to the operator
{DDMS – emergency centre} provide a warning message
{Emergency centre – rescue teams} provide overview information about the emergency
{DDMS – travellers} activate auditory evacuation instructions and signs for collective herding; deliver personalized guidance for evacuation to each traveller on his/her mobile phone
{DDMS – rescue teams} provide real-time information about the emergency and priority guidance to trapped passengers' locations

Table 13: Monument Metro Station Scenario Run 3


7.3. Colle Capretto Scenarios

The following tables outline the proposed scenarios which will be implemented at the Colle Capretto Tunnel. These scenarios will be developed in consultation with the technical partners in WP4 and representatives of the tunnel.

RUN 1 - Simple Scenario
Start status: Normal operating conditions

Events (trigger-where-consequences on structures and people)

The driver of a car (A) feels faint and skids; an incoming car (B) crashes into car (A). Drivers and passengers of the two involved cars sustain minor injuries (yellow code). Car (B) leaks fuel, which sparks and catches fire. The drivers attempt to extinguish the fire with a fire extinguisher, slowing the spread of the fire without stopping it. The air fills with smoke and dust.

Information flow

{DDMS – operator} detect an emergency and provide an overview of it to the operator
{DDMS – emergency centre} provide a warning message
{Emergency centre – rescue teams} provide overview information about the emergency

Table 14: Colle Capretto Tunnel Scenario Run 1

RUN 2 - Medium Scenario
Start status: Normal operating conditions

Events (trigger-where-consequences on structures and people)

The driver of a car (A) feels faint and skids; an incoming car (B) crashes into car (A), generating a multiple pile-up which involves a further car (C) and a school bus (D), which, trying to avoid the crash, skids and ends up sideways, blocking the carriageway. The driver of car (A) is trapped and seriously injured (red code). Drivers and passengers of the other two involved cars sustain minor injuries (yellow code); the driver of the school bus as well as all its passengers escape unharmed (green code). Car (C) leaks fuel, which sparks and catches fire. The bus driver's attempts to extinguish the fire with a fire extinguisher slow the spread of the fire without stopping it. All the uninjured passengers - elderly and children included - evacuate the bus and vehicles, whilst the smoke spreads. All bus passengers are foreign and are not familiar with the languages used in the tunnel signs and warning messages.

Information flow

{DDMS – operator} detect an emergency and provide an overview of it to the operator
{DDMS – emergency centre} provide a warning message
{Emergency centre – rescue teams} provide overview information about the emergency
{DDMS – rescue teams} provide real-time information about the emergency and priority guidance to trapped passengers' locations


Table 15: Colle Capretto Tunnel Scenario Run 2

RUN 3 - Advanced Scenario
Start status: Normal operating conditions

Events (trigger-where-consequences on structures and people)

The driver of a car (A) feels faint and skids; an incoming car (B) crashes into car (A), generating a multiple pile-up which involves a further car (C) and a school bus (D), which, trying to avoid the crash, skids and ends up sideways, blocking the carriageway. The driver of car (A) is trapped and seriously injured (red code). Drivers and passengers of the other two involved cars sustain minor injuries (yellow code); the driver of the school bus as well as all its passengers escape unharmed (green code). Car (C) leaks fuel, which sparks and catches fire. The bus driver's attempts to extinguish the fire with a fire extinguisher slow the spread of the fire without stopping it. All the uninjured passengers - elderly and children included - evacuate the bus and vehicles, whilst the smoke spreads. All bus passengers are foreign and are not familiar with the languages used in the tunnel signs and warning messages. Thick smoke spreads in the tunnel and access to the nearest shelter is occluded by the damaged remains of the school bus.

Information flow

{DDMS – operator} detect an emergency and provide an overview of it to the operator
{DDMS – emergency centre} provide a warning message
{Emergency centre – rescue teams} provide overview information about the emergency
{DDMS – travellers} activate auditory evacuation instructions and signs for collective herding; deliver personalized guidance for evacuation to each traveller on his/her mobile phone
{DDMS – rescue teams} provide real-time information about the emergency and priority guidance to trapped passengers' locations

Table 16: Colle Capretto Tunnel Scenario Run 3

7.4. Gantt Chart/Roadmap

To illustrate the progression of the work throughout WP8, the following Gantt chart has been developed, including key milestones. This shows the key points in the development of the pilot plans, as well as the key points at which the technical system is developed and implemented.


Project Year 1

Task M1 M2 M3 M4 M5 M6 M7 M8 M9 M10 M11 M12

Oct 09 Nov 09 Dec 09 Jan 10 Feb 10 Mar 10 Apr 10 May 10 Jun 10 Jul 10 Aug 10 Sep 10

A8.1 First Draft of Pilot Plan Preparation Questionnaire Prep

A8.2

A8.3

A8.4

MS MS1 MS2

Project Year 2

Task M13 M14 M15 M16 M17 M18 M19 M20 M21 M22 M23 M24

Oct 10 Nov 10 Dec 10 Jan 11 Feb 11 Mar 11 Apr 11 May 11 Jun 11 Jul 11 Aug 11 Sep 11

A8.1 D8.1 1st draft due; D8.1 living document - edits and updates as required

A8.2 A3.2 HMI Tests A5.1 Simulation Model A6.4 Collective Herding A6.1 and A7.1/2/3 User Training

A8.3

A8.4

MS

Project Year 3

Task M25 M26 M27 M28 M29 M30 M31 M32 M33 M34 M35 M36

Oct 11 Nov 11 Dec 11 Jan 12 Feb 12 Mar 12 Apr 12 May 12 Jun 12 Jul 12 Aug 12 Sep 12

A8.1 Final D8.1

A8.2

A8.3 Recruitment of Users On-site Pilot Realisation (UK and IT)

A8.4 Consolidation of Lab Testing Consolidation of Pilot Testing D8.2 due

MS MS3 MS4

MS: WP8 Milestones
MS1: Preparation of test results questionnaires
MS2: First draft of pilot plans available (D8.1)
MS3: Recruitment of pilot participants at both sites
MS4: All pilot tests performed


8. Conclusion

In summary, this deliverable has presented the following:

Pilot sites of the SAVE ME project

Use Cases and User Needs of the SAVE ME project

Use cases being tested in each of the pilot sites

Metrics used for the evaluation (technical and non-technical) of the SAVE ME systems, and the economic evaluation (in WP9)

Scenarios to be employed in the pilot testing

Specific pilot site details including resources used, breakdown of participant numbers by age etc.

List of resources available to each pilot site

This deliverable has been constructed for the benefit of both the pilot site leaders and the application developers. The application developers know which Use Cases are being tested where, and therefore whom they might need to contact in order to ensure a smooth installation and operation of their application. The pilot site leaders can use this document to keep track of their resources and infrastructure, matching their existing resources with those required by the applications. This allows both the application developers and the pilot site leaders to acquire the required equipment and infrastructure in time for the live pilot trials. Continued liaison with representatives of the pilot sites will be undertaken to ensure that the final proposed scenarios are achievable under their working terms and conditions, without causing disruption to their normal service activities. Alternatively, a dedicated time for the pilot tests may need to be arranged.


9. References

Bangor, A., Kortum, P. and Millar, J. (2009), "Determining What Individual SUS Scores Mean: Adding an Adjective Rating Scale", Journal of Usability Studies, 4(3), pp.114-123

Dehn, D.M. (2008), "Assessing the Impact of Automation on the Air Traffic Controller: The SHAPE Questionnaires", Air Traffic Control Quarterly, 16(2), pp.127-146

EUROCONTROL (2008), SHAPE - Solutions for Human Automation Partnerships in European Air Traffic Management, http://www.eurocontrol.int/humanfactors/public/standard_page/SHAPE.html

EuroTest (2007), "EuroTAP - The Future of Tunnel Testing - Single Results", http://www.eurotestmobility.net/eurotest.php?itemno=169&lang=EN#ColleCapretto

Farace, V. (2008), "What Should We Charge? Setting Price", http://www.satmansys.com/downloads/What%20Should%20We%20Charge%20-%20Setting%20Price.pdf

Fawcett, T. (2006), "An introduction to ROC analysis", Pattern Recognition Letters, 27, pp.861-874

Nexus (2009), "Nexus Business Intelligence Annual Report (A Year of Change)", http://www.nexus.org.uk/sites/nexus.org.uk/files/documents/news/Business%20Intelligence%20Annual%20Report%20200809.pdf

TEN-T EA (2011), "Upgrading the San Pellegrino Tunnel (SS n. 675 Umbro-Laziale) and the Colle Capretto Tunnel (SS n. 3bis Tiberina) on the E45 to the safety requirements for tunnels in the Trans-European Road Network" - Factsheet, http://tentea.ec.europa.eu/download/project_fiches/italy/fichenew_2009it91408p_final_1_.pdf

Van Der Laan, J.D., Heino, A. and de Waard, D. (1997), "A Simple Procedure for the Assessment of Acceptance of Advanced Transport Telematics", Transportation Research Part C, 5(1), pp.1-10

Van Der Laan Acceptance Scale, http://www.hfes-europe.org/accept/accept.htm

SAVE ME Relevant Deliverables (when available):
D1.1 SAVE ME Application Scenarios and Use Cases
D2.1 System Architecture, Ontological Framework and Module Specifications
D3.1 Guidance Plan, Based Upon Different Human Behaviour Factors
D5.2 Decision Support System


Appendix A – Final Pilot Site Questionnaires

The following pages present the final questionnaires which will be used in the pilot trials. Whilst the general set of questions follows that prescribed in this Deliverable, there are some differences between the questionnaires for the different user groups. These questionnaires have been translated into Italian for the purpose of the Colle Capretto trials.


Front page to go with every questionnaire

Dear SAVE ME trial participant,

May we thank you for contributing your time and efforts to the SAVE ME user trials – we are very grateful for your assistance.

The SAVE ME project has developed an intelligent sensor-based system which detects both natural and man-made (i.e. terrorist attacks) disaster events in public transport terminals, vehicles and critical infrastructures to support quick and optimal mass evacuation.

The ultimate aim of SAVE ME is to provide support in emergency situations to help save the lives of the general public and the rescuers, giving particular emphasis to the most vulnerable travellers (i.e. children, older people and the mobility impaired).

Please note that these user trials are intended to test the technology developed in the project and gather feedback on the potential of such a system in the future. When answering these questions, please do keep in mind that SAVE ME is a research project and as such is not intended to produce a fully-developed system that would be ready to use in a real emergency situation. We are interested in your views and opinions on what the technology shown in SAVE ME could deliver in the future, based upon your experiences with the system in these trials today.

Many thanks once again,

The SAVE ME consortium


Adult Users

Section 1 - About you

Are you: Male Female

To what age group do you belong?

18 to 29 30 to 39 40 to 49 50 to 59

How frequently do you travel by Public Transport? (Please tick ONE BOX only for each mode)

Public Transport Mode | Daily | Weekly | Monthly | Yearly | Never
Bus (Local Services) | | | | |
Coach (Express Services) | | | | |
Underground/Subway/Metro | | | | |
Railway (Overground) | | | | |
Other (please specify) | | | | |

Do you use a mobile device as a regular means of communication (e.g. mobile phone, PDA, iPhone, etc.)?

Yes No

Have you ever been involved in a transport emergency (of any type)?

Yes No

Have you received any training or instructions about how to behave when involved in a transport emergency?

Yes (Please describe below) No


Section 2 - Opinions on the SAVE ME System

Please think back to the exercises in which you have just participated and the system you have used or seen in action. Please rate the overall SAVE ME system by giving one mark on each of the following nine scales:

1 Useful Useless

2 Pleasant Unpleasant

3 Bad Good

4 Nice Annoying

5 Effective Superfluous

6 Irritating Likeable

7 Assisting Worthless

8 Undesirable Desirable

9 Exciting Boring

Please now think about how you felt when using the SAVE ME system for evacuation purposes and mark one box on each line:

Strongly Disagree | Disagree | Neutral | Agree | Strongly Agree

1. I think that I would like to use this system frequently

2. I found the system unnecessarily complex

3. I thought the system was easy to use

4. I think that I would need the support of a technical person to be able to use this system

5. I found the various functions in the system were well integrated

6. I thought there was too much inconsistency in this system

7. I imagine that most people would learn to use this system very quickly

8. I found the system very awkward to use

9. I felt very confident using the system

10. I needed to learn a lot of things before I could get going with this system

11. Overall, I would rate the user-friendliness of this system as:

Worst Imaginable | Awful | Poor | OK | Good | Excellent | Best Imaginable


Please now complete the following table based upon your trust in the SAVE ME system to help you in an emergency, where 0 = not at all and 6 = definitely. Again, please only provide one mark per line.

I felt that:

Not at All > > Maybe > > Definitely
0 1 2 3 4 5 6

a) The system was useful
b) The system was reliable
c) The system worked accurately
d) The system was understandable
e) The system worked robustly
f) I was confident when using the system

Based upon your experiences of travelling, in particular your involvement in any previous transport emergencies, do you think the SAVE ME system could improve the general evacuation procedure of travellers in emergency situations?

Yes No

Please state why you think this:

Do you think having the SAVE ME system would change the way you would behave if faced with an emergency situation?

Yes No

Please state why you think this:


Section 3 - Future Acceptance of the SAVE ME System

Overall, on a scale of 0 (would not use) to 10 (would definitely use), what score would you give the SAVE ME system in terms of its future acceptance by people like yourself?

Would Not Use > > > > Would Use > > > > Would Definitely Use
0 1 2 3 4 5 6 7 8 9 10

Please imagine that the SAVE ME application has been developed for mobile devices. On a scale of 0 (definitely not) to 10 (definitely would), how willing would you be to have this application on your mobile device?

Definitely Not > > > > Possibly > > > > Definitely Would
0 1 2 3 4 5 6 7 8 9 10

How would you like this application to be made available? (Please tick ONE option only)

I am notified by the system that the application is available, and has automatically been downloaded/installed on my mobile device

I am notified by the system that the application is available, then decide to download/install it myself

I am not notified by the system that the application is available, but search for and download/install it myself

If you chose to download it yourself, would you be willing to pay for this application?

Yes No

If YES, how much would you be willing to pay?

£0.00 £0.50 £1.00 £1.50 £2.00 £2.50 £3.00 £3.50 £4.00 £4.50 £5.00+

€0.00 €0.50 €1.00 €1.50 €2.00 €2.50 €3.00 €3.50 €4.00 €4.50 €5.00+


Do you have any further comments about the system that have not been addressed so far?

On behalf of the SAVE ME consortium, we would like to thank you for taking part in the SAVE ME pilot tests and for taking the time to complete this questionnaire. All answers and views provided here will be non-attributable and remain totally anonymous. All results in the final reports will be summaries of the data collected.


Older People/Those with Impairments

Section 1 - About you

Are you: Male Female

To what age group do you belong?

18 to 29 30 to 39 40 to 49 50 to 59 60 and over

Do you have a medical condition or other personal circumstances that may cause you difficulty in an emergency situation?

Yes No

How frequently do you travel by Public Transport? (Please tick ONE BOX only for each mode)

Public Transport Mode | Daily | Weekly | Monthly | Yearly | Never
Bus (Local Services) | | | | |
Coach (Express Services) | | | | |
Underground/Subway/Metro | | | | |
Railway (Overground) | | | | |
Other (please specify) | | | | |

Do you use a mobile device as a regular means of communication (e.g. mobile phone, PDA, iPhone, etc.)?

Yes No

Have you ever been involved in a transport emergency (of any type)?

Yes No

Have you received any training or instructions about how to behave when involved in a transport emergency?

Yes (Please describe below) No


Section 2 - Opinions on the SAVE ME System

Please think back to the exercises in which you have just participated and the system you have used or seen in action. Please rate the overall SAVE ME system by giving one mark on each of the following nine scales:

1 Useful Useless

2 Pleasant Unpleasant

3 Bad Good

4 Nice Annoying

5 Effective Superfluous

6 Irritating Likeable

7 Assisting Worthless

8 Undesirable Desirable

9 Exciting Boring

Please now think about how you felt when using the SAVE ME system for evacuation purposes and mark one box on each line:

Strongly Disagree | Disagree | Neutral | Agree | Strongly Agree

1. I think that I would like to use this system frequently

2. I found the system unnecessarily complex

3. I thought the system was easy to use

4. I think that I would need the support of a technical person to be able to use this system

5. I found the various functions in the system were well integrated

6. I thought there was too much inconsistency in this system

7. I imagine that most people would learn to use this system very quickly

8. I found the system very awkward to use

9. I felt very confident using the system

10. I needed to learn a lot of things before I could get going with this system

11. Overall, I would rate the user-friendliness of this system as:

Worst Imaginable | Awful | Poor | OK | Good | Excellent | Best Imaginable


Please now complete the following table based upon your trust in the SAVE ME

system to help you in an emergency, where 0 = not at all and 6 = definitely. Again,

please only provide one mark per line. I felt that:

Not at All   > >   Maybe   > >   Definitely

0 1 2 3 4 5 6

a) The system was useful

b) The system was reliable

c) The system worked accurately

d) The system was understandable

e) The system worked robustly

f) I was confident when using the system

Based upon your experiences of travelling (especially your involvement in any

previous transport emergencies), do you think the SAVE ME system could improve

the evacuation procedure in emergency situations?

Yes No

Please state why you think this:

Do you think the SAVE ME system would change the way you would behave if faced

with an emergency situation?

Yes No

Please state why you think this:


Section 3 - Future Acceptance of the SAVE ME System

Overall, on a scale of 0 (would not use) to 10 (would definitely use), what score

would you give the SAVE ME system/device in terms of its future acceptance by

people like yourself?

Would Not Use   > > > >   Would Use   > > > >   Would Definitely Use

0 1 2 3 4 5 6 7 8 9 10

Please imagine that the SAVE ME application has been developed for mobile

devices. On a scale of 0 (definitely not) to 10 (definitely would), how willing would you

be to have this application on your mobile device?

Definitely Not   > > > >   Possibly   > > > >   Definitely Would

0 1 2 3 4 5 6 7 8 9 10

How would you like this application to be made available? (Please tick ONE option

only)

I am notified by the system that the application is available, and has automatically been downloaded/installed on my mobile device

I am notified by the system that the application is available, then decide to download/install it myself

I am not notified by the system that the application is available, but search for and download/install it myself

If you chose to download it yourself, would you be willing to pay for this application?

Yes No


If YES, how much would you be willing to pay?

£0.00 £0.50 £1.00 £1.50 £2.00 £2.50 £3.00 £3.50 £4.00 £4.50 £5.00+

€0.00 €0.50 €1.00 €1.50 €2.00 €2.50 €3.00 €3.50 €4.00 €4.50 €5.00+

Do you have any further comments about the system that have not been addressed

so far?

On behalf of the SAVE ME consortium, we would like to thank you for taking part in

the SAVE ME pilot tests and for taking the time to complete this questionnaire. All

answers and views provided here will be non-attributable and remain totally

anonymous. All results in the final reports will be summaries of the data collected.


Child Users (Under 16)

Section 1 - About you

Are you: Male Female

How old are you?

Under 8   8 or 9   10 or 11   12 or 13   14 or 15

Do you travel on Public Transport by yourself, or with your parents/guardians/carers?

Yes, I always travel alone

Yes, I sometimes travel alone

No, I always travel with my parents/guardian

No, I do not use Public Transport at all

Do you have a mobile device (e.g. Mobile Phone or an iPhone)?

Yes No

Have you ever been involved in a transport emergency (of any type)?

Yes No

Has an adult (such as your parents, a teacher, a fireman or a policeman) told you

what to do when involved in a transport emergency?

Yes No


Section 2 – Your feelings about the SAVE ME system

Please think back to the exercise you have just completed with everybody else.

Did you like using the SAVE ME system?

Yes No

Please now think about how you felt when using the SAVE ME system for evacuation

purposes and mark one box on each line:

Definitely Not   Not Really   Maybe   Yes   Definitely Yes!

1. I think that I would like to use this system a lot

2. I did not understand how the system worked

3. I thought the system was easy to use

4. I think that I would need help to use this system

5. I found the different parts of the system worked well together

6. I thought there were too many different messages in this system

7. I think most people would learn to use this system very quickly

8. I found the system very hard to use

9. I felt very happy using the system

10. I needed to learn a lot of things before I could use this system

11. Overall, I would rate the user-friendliness of this system as



On a scale of 0 (which means you would not use the SAVE ME system), to 10 (which

means you definitely would use the SAVE ME system), what score would you give

the SAVE ME system? Please only tick one box in the table below.

I Would Not Use   > > > >   I Would Use   > > > >   I Would Definitely Use

0 1 2 3 4 5 6 7 8 9 10

Do you think the SAVE ME system would change the way you would behave in an

emergency?

Yes No

Please now complete the following table based upon how you felt about using the

SAVE ME system to help you in an emergency. A score of 0 is when you were not

happy and a score of 6 is when you were very happy. If you cannot decide, please

tick the box in the middle (which is a score of 3). Please only provide one mark per

line.

Definitely Not   No   Not Really   Maybe   Perhaps   Yes   Definitely Yes!

0 1 2 3 4 5 6

a) The system was useful to me

b) The system worked very well

c) The system told me the right thing

d) I knew what the system was telling me

e) The system did what I wanted it to do

f) I was happy using the system


Section 3 – The SAVE ME System in the Future

In this section, please imagine that a SAVE ME application has been developed for

your mobile device. On a scale of 0 (definitely not) to 10 (definitely would), would you

like to install this application on your own mobile device?

Definitely Not   > > > >   Possibly   > > > >   Definitely Would

0 1 2 3 4 5 6 7 8 9 10

Do you have any further comments about the system that we have not asked you

about so far?

On behalf of the SAVE ME consortium, we would like to thank you for taking part in

the SAVE ME pilot tests and for taking the time to complete this questionnaire – your

answers will be really useful in helping us to improve the system.


Tourists/Unfamiliar Users

Section 1 - About you

Are you: Male Female

To what age group do you belong?

18 to 29 30 to 39 40 to 49 50 to 59 60 and over

How frequently do you travel by Public Transport in your home country?

Public Transport Mode

Daily Weekly Monthly Yearly Never

Bus (Local Services)

Coach (Express Services)

Underground/ Subway/Metro

Railway (Overground)

Other (please specify)

Please tick ONE BOX only for each mode in the table

How frequently do you travel by Public Transport when abroad (holiday or on

business)?

Public Transport Mode

Always Sometimes Occasionally Rarely Never

Bus (Local Services)

Coach (Express Services)

Underground/ Subway/Metro

Railway (Overground)

Other (please specify)

Please tick ONE BOX only for each mode in the table


Do you use a mobile device as a regular means of communication (e.g. Mobile

Phone, PDA, iPhone, etc.)?

Yes No

Have you ever been involved in a transport emergency (of any type)?

Yes No

Have you received any training or instructions about how to behave when involved in

a transport emergency?

Yes (Please describe below) No

Section 2 - Opinions on the SAVE ME System

Please think back to the exercises in which you have just participated and the system

you have used or seen in action. Please rate the overall SAVE ME system by giving

one mark on the following nine scales:

1 Useful Useless

2 Pleasant Unpleasant

3 Bad Good

4 Nice Annoying

5 Effective Superfluous

6 Irritating Likeable

7 Assisting Worthless

8 Undesirable Desirable

9 Exciting Boring


Please now think about how you felt when using the SAVE ME system for evacuation

purposes and mark one box on each line:

Strongly Disagree   Disagree   Neutral   Agree   Strongly Agree

1. I think that I would like to use this system frequently

2. I found the system unnecessarily complex

3. I thought the system was easy to use

4. I think that I would need the support of a technical person to be able to use this system

5. I found the various functions in the system were well integrated

6. I thought there was too much inconsistency in this system

7. I imagine that most people would learn to use this system very quickly

8. I found the system very awkward to use

9. I felt very confident using the system

10. I needed to learn a lot of things before I could get going with this system

11. Overall, I would rate the user-friendliness of this system as:

Worst Imaginable   Awful   Poor   OK   Good   Excellent   Best Imaginable


Please now complete the following table based upon your trust in the SAVE ME

system to help you in an emergency, where 0 = not at all and 6 = definitely. Again,

please only provide one mark per line.

I felt that:

Not at All   > >   Maybe   > >   Definitely

0 1 2 3 4 5 6

a) The system was useful

b) The system was reliable

c) The system worked accurately

d) The system was understandable

e) The system worked robustly

f) I was confident when using the system

Based upon your experiences of travelling, in particular your involvement in any

previous transport emergencies, do you think the SAVE ME system could improve

the general evacuation procedure of travellers in emergency situations?

Yes No

Please state why you think this:

Do you think having the SAVE ME system would change the way you would behave

if faced with an emergency situation?

Yes No

Please state why you think this:


Section 3 - Future Acceptance of the SAVE ME System

Overall, on a scale of 0 (would not use) to 10 (would definitely use), what score

would you give the SAVE ME system in terms of its future acceptance by people like

yourself?

Would Not Use   > > > >   Would Use   > > > >   Would Definitely Use

0 1 2 3 4 5 6 7 8 9 10

Please imagine that the SAVE ME application has been developed for mobile

devices. On a scale of 0 (definitely not) to 10 (definitely would), how willing would you

be to have this application on your mobile device?

Definitely Not   > > > >   Possibly   > > > >   Definitely Would

0 1 2 3 4 5 6 7 8 9 10

How would you like this application to be made available? (Please tick ONE option

only)

I am notified by the system that the application is available, and has automatically been downloaded/installed on my mobile device

I am notified by the system that the application is available, then decide to download/install it myself

I am not notified by the system that the application is available, but search for and download/install it myself

If you chose to download it yourself, would you be willing to pay for this application?

Yes No


If YES, how much would you be willing to pay?

£0.00 £0.50 £1.00 £1.50 £2.00 £2.50 £3.00 £3.50 £4.00 £4.50 £5.00+

€0.00 €0.50 €1.00 €1.50 €2.00 €2.50 €3.00 €3.50 €4.00 €4.50 €5.00+

Do you have any further comments about the system that have not been addressed so

far?

On behalf of the SAVE ME consortium, we would like to thank you for taking part in

the SAVE ME pilot tests and for taking the time to complete this questionnaire. All

answers and views provided here will be non-attributable and remain totally

anonymous. All results in the final reports will be summaries of the data collected.


Infrastructure Operators

Section 1 – About You

Organisation/Company ________________________________

Position/Job Title ________________________________

Relevant experiences, main roles and responsibilities relating to Transport Safety

Issues

Section 2 - Opinions on the SAVE ME System

Please think back to the exercises in which you have just participated and the system

you have used or seen in action. Please rate the overall SAVE ME system by giving

one mark on the following nine scales:

1 Useful Useless

2 Pleasant Unpleasant

3 Bad Good

4 Nice Annoying

5 Effective Superfluous

6 Irritating Likeable

7 Assisting Worthless

8 Undesirable Desirable

9 Exciting Boring


Please now think about how you felt when using the SAVE ME system during the

evacuation exercises and mark one box on each line:

Strongly Disagree   Disagree   Neutral   Agree   Strongly Agree

1. I think that I would like to use this system frequently

2. I found the system unnecessarily complex

3. I thought the system was easy to use

4. I think that I would need the support of a technical person to be able to use this system

5. I found the various functions in the system were well integrated

6. I thought there was too much inconsistency in this system

7. I imagine that most people would learn to use this system very quickly

8. I found the system very awkward to use

9. I felt very confident using the system

10. I needed to learn a lot of things before I could get going with this system

11. Overall, I would rate the user-friendliness of this system as:

Worst Imaginable   Awful   Poor   OK   Good   Excellent   Best Imaginable

Do you have any further comments about the system usability?


Situational Awareness

Please now complete the following table based upon your use of the SAVE ME

system during the emergency exercises, where 0 = never and 6 = always. Again,

please only provide one mark per line.

During the SAVE ME exercises:

Never > > Sometimes > > Always

0 1 2 3 4 5 6

a) I was ahead of the events

b) I started to focus on a single problem or specific task

c) There was a risk of forgetting something important

d) I was able to plan and organise my tasks as I wanted

e) I was surprised by an unexpected event

f) I had to search for an item of information

Do you have any further comments about the system’s assistive capabilities?


Workload

Please now complete the following table based upon the amount of effort required to

use the SAVE ME system during the emergency exercises, where 0 = no effort at all

and 6 = extreme amounts of effort. Again, please only provide one mark per line.

During the SAVE ME exercise, how much effort did it take to:

No Effort   > >   Some Effort   > >   Extreme Effort

0 1 2 3 4 5 6

1) Prioritise tasks?

2) Identify potential conflicts?

3) Scan the display?

4) Evaluate mitigation options against the situation and other conditions?

5) Anticipate future situations?

6) Recognise a mismatch of available data with the actual situation?

7) Issue timely commands?

8) Evaluate the consequences of a plan?

9) Manage information?

10) Share information with team members?

11) Recall necessary information?

12) Anticipate team members’ needs?

13) Prioritise requests?

14) Scan progress of exercise?

15) Access relevant information?

16) Gather and interpret information?


Section 3 - Future Acceptance of the SAVE ME System

In this section, please imagine that the SAVE ME system has been developed for

actual use in real-life emergency situations. On a scale of 0 (definitely not) to 10

(definitely would), how willing would you be to use SAVE ME in your role during an

emergency?

Definitely Not   > > > >   Possibly   > > > >   Definitely Would

0 1 2 3 4 5 6 7 8 9 10

Based upon your experiences, in particular your involvement in any previous

transport emergencies, do you think the SAVE ME system could improve the general

evacuation procedure in emergency situations?

Yes No

Please state why you think this:

Do you have any further comments about the system that have not been addressed

so far?

On behalf of the SAVE ME consortium, we would like to thank you for taking part in

the SAVE ME pilot tests and for taking the time to complete this questionnaire. All

answers and views provided here will be non-attributable and remain totally

anonymous. All results in the final reports will be summaries of the data collected.


Rescue Personnel

Section 1 – About You

Name of organisation ________________________________

Position within organisation ________________________________

How many years have you been working in the rescue services? __________ years

What are your main responsibilities/tasks within your organisation?

Please (briefly) describe the training you have received for dealing with emergency situations:


Section 2 - Opinions on the SAVE ME System

Please think back to the exercises in which you have just participated and the system

you have used or seen in action. Please rate the overall SAVE ME system by giving

one mark on the following nine scales:

1 Useful Useless

2 Pleasant Unpleasant

3 Bad Good

4 Nice Annoying

5 Effective Superfluous

6 Irritating Likeable

7 Assisting Worthless

8 Undesirable Desirable

9 Exciting Boring

Please now think about how you felt when using the SAVE ME system during the

evacuation exercises and mark one box on each line:

Strongly Disagree   Disagree   Neutral   Agree   Strongly Agree

1. I think that I would like to use this system frequently

2. I found the system unnecessarily complex

3. I thought the system was easy to use

4. I think that I would need the support of a technical person to be able to use this system

5. I found the various functions in the system were well integrated

6. I thought there was too much inconsistency in this system

7. I imagine that most people would learn to use this system very quickly

8. I found the system very awkward to use

9. I felt very confident using the system

10. I needed to learn a lot of things before I could get going with this system

11. Overall, I would rate the user-friendliness of this system as:

Worst Imaginable   Awful   Poor   OK   Good   Excellent   Best Imaginable


Situational Awareness

Please now complete the following table based upon your use of the SAVE ME

system during the emergency exercises, where 0 = never and 6 = always. Again,

please only provide one mark per line.

During the SAVE ME exercises:

Never > > Sometimes > > Always

0 1 2 3 4 5 6

a) I was ahead of the events

b) I started to focus on a single problem or specific task

c) There was a risk of forgetting something important

d) I was able to plan and organise my tasks as I wanted

e) I was surprised by an unexpected event

f) I had to search for an item of information

Do you have any further comments about the system usability?


Enhancing Team Working and Communications

Please now complete the following table based upon your use of the SAVE ME

system during the emergency exercises, where 0 = never and 6 = always. Again,

please only provide one mark per line. During the SAVE ME exercises:

Never   > >   Sometimes   > >   Always

0 1 2 3 4 5 6

1) It was clear to me which tasks were my responsibility

2) It was clear to me which tasks were carried out by the other team members

3) It was clear to me which tasks I shared with the other team members

4) The system enabled the team to prioritise tasks efficiently

5) The system helped the team to synchronise their actions

6) The goals of the team were clearly defined

7) The system promoted a smooth flow of information

8) The system helped the team to follow the procedures

9) The system helped me to detect the other team members’ inaccuracies or mistakes

10) The system helped me to share information about developing situations with others

11) I liked working in the team

12) I felt supported by the other team members


Section 3 - Future Acceptance of the SAVE ME System

In this section, please imagine that the SAVE ME system has been developed for

actual use in real-life emergency situations. On a scale of 0 (definitely not) to 10

(definitely would), how willing would you be to use SAVE ME in your role during an

emergency?

Definitely Not   > > > >   Possibly   > > > >   Definitely Would

0 1 2 3 4 5 6 7 8 9 10

Based upon your experiences, in particular your involvement in any previous

transport emergencies, do you think the SAVE ME system could improve the general

evacuation procedure in emergency situations?

Yes No

Please state why you think this:

What additions (if any) would you like to see included in future developments of

the SAVE ME system?

Do you have any further comments about the system that have not been addressed so

far?

On behalf of the SAVE ME consortium, we would like to thank you for taking part in the SAVE ME pilot tests and for taking the time to complete this questionnaire. All answers and views provided here will be non-attributable and remain totally anonymous. All results in the final reports will be summaries of the data collected.