Scientific Development Branch
Dataset Production and Performance Evaluation for Event Detection and Tracking
Paul Hosmer, Detection and Vision Systems Group


Page 1: Dataset Production and Performance Evaluation for Event Detection and Tracking

Scientific Development Branch

Dataset Production and Performance Evaluation for Event Detection and Tracking

Paul Hosmer Detection and Vision Systems Group

Page 2: Dataset Production and Performance Evaluation for Event Detection and Tracking

SCIENTIFIC DEVELOPMENT BRANCH – PAUL HOSMER – BMVA Performance Evaluation Symposium 2007

Outline

• Defining a requirement

• What to include in datasets

• Constraints

• Evaluation and Metrics

• Case Study

Page 3: Dataset Production and Performance Evaluation for Event Detection and Tracking


Intelligent Video

– Started in the early 1990s – FABIUS, Amethyst
– Through to the 2000s – VMD capability study
– Standards-based evaluations

Background

Page 4: Dataset Production and Performance Evaluation for Event Detection and Tracking


What did we want to achieve?

• Test systems in a short period of time

• Provide data and requirements to research community

Dataset production

Problem: what to include?

Page 5: Dataset Production and Performance Evaluation for Event Detection and Tracking


Scenario definition

• What is an event?

• Where does the scenario take place?

• What challenges are posed by the environment?

Ask end users / gauge demand

Conduct capability study

Monitor environment, apply a priori knowledge

Page 6: Dataset Production and Performance Evaluation for Event Detection and Tracking


Scenario definition

• Abandoned Baggage

• When is an object abandoned?

• What types of object?

• Attributes of the person?

Page 7: Dataset Production and Performance Evaluation for Event Detection and Tracking


Scenario definition: “Abandoned object”

• During the current clip, a person has placed an object which was in their possession when they entered the clip onto the floor or a seat in the detection area &

• That person has left the detection area without the object &

• Over sixty seconds after they left the detection area, that person has still not returned to the object &

• The object remains in the detection area.
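The four clauses above reduce to a timing check: when the owner left, whether they have since returned, and whether the object is still present. A minimal sketch of that rule follows; the `ObjectEvent` fields and function names are illustrative assumptions, not part of the i-LIDS specification.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ObjectEvent:
    # All times in seconds from the start of the clip (illustrative schema).
    owner_left_time: float                # owner left the detection area without the object
    owner_return_time: Optional[float]    # owner returned to the object, or None if never
    object_removed_time: Optional[float]  # object left the detection area, or None if still there

ABANDONMENT_DELAY = 60.0  # "over sixty seconds" from the definition above

def is_abandoned(ev: ObjectEvent, now: float) -> bool:
    """Apply the timing clauses of the 'abandoned object' definition at time `now`."""
    deadline = ev.owner_left_time + ABANDONMENT_DELAY
    if now <= deadline:
        return False  # the sixty-second window has not yet elapsed
    owner_returned = ev.owner_return_time is not None and ev.owner_return_time <= now
    object_removed = ev.object_removed_time is not None and ev.object_removed_time <= now
    return not owner_returned and not object_removed
```

Encoding the definition this explicitly is what makes ground-truth annotation repeatable across different annotators and clips.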

Page 8: Dataset Production and Performance Evaluation for Event Detection and Tracking


Scenario definition

• Key environmental factors:

• Lighting changes – film dawn and dusk

• Rain and snow

• Night – headlights and low SNR

Page 9: Dataset Production and Performance Evaluation for Event Detection and Tracking


How much data?

• Need to demonstrate performance on a wide range of imagery

• Statistical significance

• Need a large training and test corpus – hundreds of events

• Unseen data for verification

Page 10: Dataset Production and Performance Evaluation for Event Detection and Tracking


Constraints

• You can’t always capture the event you want – simulation

• Make simulations as close to the requirement as possible

• Storage vs image quality – what will you want to do with the data at a later time?

• Cost – try to film as much variation and as many events as you can

Page 11: Dataset Production and Performance Evaluation for Event Detection and Tracking


Performance Evaluation

• Importance of metrics – consistency across different evaluations

• When is an event detected?

• Real-time evaluation, 10x real time, offline… which is most useful?

• Statistically significant unseen dataset: performance on training data does not tell me anything useful about robustness

Page 12: Dataset Production and Performance Evaluation for Event Detection and Tracking


How HOSDB does it

• Simulate a real analogue CCTV system

• ~215,000 frames per scenario evaluation

• ~300 events per evaluation

• 60s to alarm after the GT alarm condition is satisfied

• One figure of merit for ranking
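The 60-second alarm window implies a matching rule between system alarms and ground-truth (GT) events. A hedged sketch of one such rule is below; the greedy matching strategy, function name, and argument names are my assumptions, not the actual HOSDB tooling.

```python
ALARM_WINDOW = 60.0  # seconds allowed between the GT alarm condition and the system alarm

def score_alarms(gt_times, alarm_times, window=ALARM_WINDOW):
    """Count TP/FP/FN by greedily matching each GT event to the first
    unmatched system alarm that fires within `window` seconds after it."""
    alarms = sorted(alarm_times)
    used = [False] * len(alarms)
    tp = 0
    for gt in sorted(gt_times):
        for i, t in enumerate(alarms):
            if not used[i] and gt <= t <= gt + window:
                used[i] = True
                tp += 1
                break
    fn = len(gt_times) - tp        # GT events with no alarm in their window
    fp = used.count(False)         # alarms that matched no GT event
    return tp, fp, fn
```

The resulting TP/FP/FN counts feed directly into the single figure of merit used for ranking systems.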

Page 13: Dataset Production and Performance Evaluation for Event Detection and Tracking


F1 score for event detection

F1 = (α + 1)RP / (R + αP)

where

R = TP / (TP + FN)

P = TP / (TP + FP)

α ranges from 0.35 to 75 depending on scenario and application
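The weighted F-measure above can be sketched directly from event-level counts; this is a minimal rendering of the slide's formula, assuming only integer TP/FP/FN counts as inputs.

```python
def weighted_f_measure(tp: int, fp: int, fn: int, alpha: float) -> float:
    """Weighted F-measure F = (alpha + 1) * R * P / (R + alpha * P).

    alpha trades recall against precision; the slides note values from
    0.35 to 75 depending on the scenario and application.
    """
    if tp == 0:
        return 0.0  # no true positives: both R and P are zero
    recall = tp / (tp + fn)
    precision = tp / (tp + fp)
    return (alpha + 1) * recall * precision / (recall + alpha * precision)
```

Note that alpha = 1 recovers the familiar unweighted F1 = 2RP / (R + P).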

Page 14: Dataset Production and Performance Evaluation for Event Detection and Tracking


What about Tracking?

5th i-LIDS scenario
– Multiple Camera Tracking
– Increasing interest from end users
– Significant potential to enhance operator effectiveness and aid post-event investigation

The Problem
– Unifying track labelling across multiple camera views

Dataset and Evaluation Problem
– Synchronisation

Page 15: Dataset Production and Performance Evaluation for Event Detection and Tracking


Operational Requirement

Camera Requirements:
– Existing CCTV systems
– Cameras are a mixture of overlapping and non-overlapping
– Internal cameras are generally fixed and colour

Scene Contents:
– Scenes are likely to contain rest points
– Varying traffic densities

Target Description:
– There may be multiple targets
– Targets from a wide demographic

Page 16: Dataset Production and Performance Evaluation for Event Detection and Tracking


Imagery Collection

Page 17: Dataset Production and Performance Evaluation for Event Detection and Tracking


Imagery Collection

Location
– Large transport hub (airport)

Targets
– Varied targets
– Differing target behaviour
– Varying crowd densities

Environment
– Lighting changes
– Filmed at dawn, day, dusk and night

Volume
– 5 cameras
– 1.35 million frames
– Single and multiple targets
– 1000+ target events
– 1TB external HDD

Page 18: Dataset Production and Performance Evaluation for Event Detection and Tracking


Page 19: Dataset Production and Performance Evaluation for Event Detection and Tracking


Dataset structure

Target Event Sets: MCT01, MCT02, MCT03

TES Properties: daytime / high density, night time / low density, etc.

Stages: Overlapping, Non-Overlapping, Mixed

Page 20: Dataset Production and Performance Evaluation for Event Detection and Tracking


Performance Metric

F1 = 2RP / (R + P)

P = Overlapping Pixels / Total Track Pixels

R = Overlapping Pixels / Total Ground Truth Pixels

– F1 must be greater than or equal to 0.25 for the track to be a True Positive

Page 21: Dataset Production and Performance Evaluation for Event Detection and Tracking


– Performance evaluation is important
– Evaluations need to use more data
– With richer content
– With widely accepted definitions and metrics
– Demonstrate improved performance