
MediaEval 2015 - Overview of the MediaEval 2015 Drone Protect Task


Drone Protect Task Working Paper, MediaEval 2015

Atta Badii, Pavel Korshunov, Hamid Oudi, Touradj Ebrahimi, Tomas Piatrik, Volker Eiselein, Natacha Ruchaud, Christian Fedorczak, Jean-Luc Dugelay, Diego Fernandez Vazquez


Task Description

•  To explore ways to optimise the process of privacy filtering so as to:

1.  obscure personal visual information effectively, whilst

2.  retaining as much as possible of the ‘useful’ information that would enable a human viewer to interpret the obscured video frame (see the filter sketch after this slide).

Slide 2
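To make the trade-off above concrete, here is a minimal sketch of a region-based pixelation filter, assuming a NumPy frame array and a hypothetical `pixelate_roi` helper; it illustrates the kind of filter the task calls for, not any participant's actual method. Only the annotated bounding box is coarsened, so identifying detail is obscured while the surrounding scene remains interpretable.

```python
import numpy as np

def pixelate_roi(frame: np.ndarray, bbox: tuple, block: int = 16) -> np.ndarray:
    """Coarsen the pixels inside bbox = (x, y, w, h); the rest of the frame is untouched.

    A larger `block` obscures more identity detail but also removes more of the
    'useful' context a human viewer needs to interpret the scene.
    """
    x, y, w, h = bbox
    out = frame.copy()
    roi = out[y:y + h, x:x + w]
    # Average each block-by-block tile and write it back, producing a mosaic effect.
    for by in range(0, h, block):
        for bx in range(0, w, block):
            tile = roi[by:by + block, bx:bx + block]
            tile[...] = tile.mean(axis=(0, 1), keepdims=True).astype(frame.dtype)
    return out
```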


Privacy Filtering (Side-)Effects and Affects

•  Privacy Protection Level – how adequate was the level of privacy protection achieved by the filter across all test video clips?

•  Level of Intelligibility – how much ‘useful’ information was retained in the video frames after privacy filtering had been applied?

•  Pleasantness – the ‘aesthetic’ perceptual appeal of the resulting privacy-filtered video frames to human viewers; how acceptable were any adverse aesthetic effects?

•  All evaluation results were sent out, including overall and ranking results based on the evaluators’ assigned weightings for the above three criteria (see the weighted-score sketch after this slide).

Slide 3
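The exact weighting scheme is not specified here, so the snippet below is only a sketch of one plausible convention, assuming per-criterion scores in [0, 1] combined by a normalised weighted mean; the criterion names and weights are illustrative, not the task's official values.

```python
def overall_score(scores: dict, weights: dict) -> float:
    """Combine per-criterion scores into one weighted mean reflecting an
    evaluator's assigned weightings."""
    total_weight = sum(weights[c] for c in scores)
    return sum(scores[c] * weights[c] for c in scores) / total_weight

# Illustrative (not official) scores and weights for one submission:
print(overall_score(
    {"privacy": 0.8, "intelligibility": 0.6, "pleasantness": 0.4},
    {"privacy": 0.5, "intelligibility": 0.3, "pleasantness": 0.2},
))
```

Ranking submissions then amounts to sorting them by this overall score under each evaluator's weighting.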


DPT 2015 DATASET

•  Produced in compliance with EU Data Protection regulations

•  Dataset includes 38 FHD video clips, ~20 seconds each, showing car park security scenarios

•  Pre-annotated to signify the re-identifiability specificity of features: Low, Medium, High

•  Persons carrying specific items (backpacks, umbrellas, wearing scarves)

•  Persons near/interacting with cars

•  Behaviours tagged as Normal (walking), Suspicious (loitering) and Illicit (stealing/abandoning a car)

Sensitivity levels per privacy-sensitive region (a lookup sketch follows after this slide):

Skin – Medium (M)
Face – High (H)
Hair – Low (L)
Accessories – Medium (M)
Person – Low (L)
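The sensitivity levels above can be read as a lookup that drives how aggressively each region type is filtered. The mapping below mirrors that list (license plates are added from the ground-truth slide that follows); the block sizes are purely illustrative assumptions, not values defined by the task.

```python
# Sensitivity levels from the DPT 2015 dataset (region -> level).
SENSITIVITY = {
    "skin": "M", "face": "H", "hair": "L",
    "accessories": "M", "person": "L", "license_plate": "H",
}

# Hypothetical filter strengths per level: stronger pixelation for higher sensitivity.
BLOCK_SIZE = {"H": 32, "M": 16, "L": 8}

def block_for(region: str) -> int:
    """Pick a pixelation block size for a region, defaulting to the strongest filter."""
    return BLOCK_SIZE[SENSITIVITY.get(region, "H")]
```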


DPT 2015 DATASET

•  Ground Truth

•  Bounding boxes manually tagged as LPII, MPII and HPII (Low/Medium/High Personally Identifiable Information)

•  Covering License Plates (H), Skin (M), Face (H), Hair (L), Accessories (M), and the Person’s body (L) (see the annotation sketch after this slide)

Challenges

•  RoIs not annotated: the face/head in the ‘person-entering-a-car’ event, as covered in 2014

•  Persons at variable distance from the camera

•  Some jitter effects

Slide 5
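A minimal sketch of how the ground-truth bounding boxes described above might be represented and consumed, reusing the hypothetical `pixelate_roi` and `block_for` helpers from the earlier sketches; the field names are assumptions for illustration, not the task's actual annotation schema.

```python
from dataclasses import dataclass

@dataclass
class Annotation:
    frame: int       # frame index within the clip
    bbox: tuple      # (x, y, w, h) in pixels
    region: str      # e.g. "face", "skin", "license_plate"
    pii_level: str   # "L", "M" or "H" (LPII / MPII / HPII)

def protect_frame(image, annotations):
    """Apply the pixelation sketch to every annotated region of one frame."""
    for ann in annotations:
        image = pixelate_roi(image, ann.bbox, block=block_for(ann.region))
    return image
```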


Evaluation Setup

•  Participants submitted privacy-protected video clips using the testing subset

•  Evaluation of the submitted solutions was based on the human perception of the levels of i) privacy filtering, ii) retained information, i.e. intelligibility, and iii) appropriateness (acceptability/attractiveness) of the privacy-filtered High/Medium/Low PII regions

•  6 evaluators from the security practitioners’ category and 11 from the naïve category evaluated the submissions by responding to 13 evaluation questions for each of 3 randomly selected videos from each solution set (see the selection sketch after this slide)

Slide 6
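A minimal sketch of that random selection step, assuming each solution set is simply a list of clip identifiers per team; the layout and function name are assumptions for illustration.

```python
import random

def pick_clips_for_evaluator(solution_sets, n=3, seed=None):
    """Randomly choose n clips from each team's submitted solution set for one evaluator."""
    rng = random.Random(seed)
    return {team: rng.sample(clips, n) for team, clips in solution_sets.items()}

# Illustrative (hypothetical) submission layout:
picked = pick_clips_for_evaluator({"team_a": ["clip01", "clip07", "clip12", "clip30"]})
```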


Evaluation Process

•  A questionnaire consisting of 12 questions was carefully designed to examine aspects related to privacy, intelligibility, and pleasantness; this was used in streams 2 and 3.

•  The first 5 questions were aimed at eliciting the evaluators’ opinions regarding the contents of the viewed videos; the responses to these questions were assessed against the ground truth.

•  The remaining questions were aimed at eliciting the evaluators’ subjective opinions regarding the viewed videos.

•  The average score for all submissions is illustrated in the following figure (see the aggregation sketch after this slide).

Slide 7
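A minimal sketch of the aggregation implied above, assuming the first questions are scored against the ground truth (content accuracy) and the rest are subjective ratings on a 1–5 scale; the indices, scale, and function name are assumptions for illustration.

```python
def aggregate_responses(responses, ground_truth, n_content_questions=5):
    """Average one submission's questionnaire responses over all completed questionnaires.

    `responses` is a list of answer lists; in each, the first n_content_questions
    are compared against `ground_truth`, and the remainder are subjective ratings.
    """
    content_scores, subjective_scores = [], []
    for answers in responses:
        content = answers[:n_content_questions]
        # Fraction of content questions whose answers match the ground truth.
        content_scores.append(
            sum(a == g for a, g in zip(content, ground_truth)) / n_content_questions
        )
        # Mean subjective rating, rescaled from the 1-5 scale to 0-1.
        subjective = answers[n_content_questions:]
        subjective_scores.append((sum(subjective) / len(subjective) - 1) / 4)
    return {
        "content_accuracy": sum(content_scores) / len(content_scores),
        "subjective_mean": sum(subjective_scores) / len(subjective_scores),
    }
```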


Stream 1 Results

Slide 8


Thank You