
Introduction to Sensor Data Fusion: Methods and Applications

• Last lecture: “Why Sensor Data Fusion?”

– Motivation, general context
– Discussion of examples

• oral examination: 6 credit points after the end of the semester

• prerequisite: participation in the exercises, running programs

• continuation in Summer: lectures and seminar on advanced topics

• job opportunities as research assistant in ongoing projects, practicum

• subsequently: master theses at Fraunhofer FKIE, PhD theses possible

• slides/script: email to [email protected], download

Sensor Data Fusion - Methods and Applications, 2nd Lecture on October 28, 2015

Sensor & Information Fusion: Basic Task


information sources: defined by operational requirements


information to be fused: imprecise, incomplete, ambiguous, un-resolved, false, deceptive, hard to formalize, contradictory . . .


• single sensors / network: measurements
– kinematical parameters
– classification attributes

• data processing/fusion
– temporal integration / logical analysis
– statistical estimation / data association
– combination with a priori information

• condensed information
– objects represented by “track structures”
– quantitative accuracy/reliability measures

A Generic Tracking and Sensor Data Fusion System

[Block diagram] Sensor systems (sensing hardware: received waveforms → signal processing: parameter estimation → detection process: data rate reduction) deliver sensor data to the tracking & fusion system and receive sensor control in return. Inside the tracking & fusion system: association of sensor data to the track file (storage); track maintenance (prediction, filtering, retrodiction); track initiation / track extraction (multiple frame); track processing (track cancellation, object classification / ID, track-to-track fusion); a priori knowledge (sensor performance, object environment, object characteristics); man-machine interface (displaying functions, object representation, interaction facilities).


source of information: received waveforms (space-time)


detection: a decision process for data rate reduction


data fusion input: estimated target parameters, false plots


core function: associate sensor data to established tracks


track maintenance: updating of established target tracks


initiation: establish new tracks, re-initiate lost tracks


exploit available context information (e.g. topography)


fusion/management of pre-processed track information


user interface: presentation, interaction, decision support


Target Tracking: Basic Idea, Demonstration

Problem-inherent uncertainties and ambiguities!
BAYES: processing scheme for ‘soft’, ‘delayed’ decisions

sensor performance: • resolution conflicts • DOPPLER blindness

environment: • dense situations • clutter • jamming/deception

target characteristics: • qualitatively distinct maneuvering phases

background knowledge: • vehicles on road networks • tactics


[plot: pdf at $t_{k-1}$]

‘Probability density functions (pdf)’ $p(x_{k-1}|\mathcal{Z}^{k-1})$ represent imprecise knowledge on the ‘state’ $x_{k-1}$ based on imprecise measurements $\mathcal{Z}^{k-1}$.

[plots: pdf at $t_{k-1}$; prediction for $t_k$]

Exploit imprecise knowledge on the dynamical behavior of the object:

$$\underbrace{p(x_k|\mathcal{Z}^{k-1})}_{\text{prediction}} \;=\; \int\! dx_{k-1}\;\underbrace{p(x_k|x_{k-1})}_{\text{dynamics}}\;\underbrace{p(x_{k-1}|\mathcal{Z}^{k-1})}_{\text{old knowledge}}\,.$$
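The prediction step can be made concrete on a discrete state grid. The following sketch (Python/NumPy; the grid, the Gaussian dynamics model, and all numerical values are illustrative assumptions, not taken from the lecture) propagates a prior density through a drift-and-diffusion transition density:

```python
import numpy as np

# Sketch of the prediction (Chapman-Kolmogorov) step on a 1-D state grid.
# The grid, the Gaussian dynamics model and all numbers are illustrative
# assumptions, not taken from the lecture.

x = np.linspace(-10.0, 10.0, 401)     # discretized state axis
dx = x[1] - x[0]

def gauss(u, mu, sigma):
    return np.exp(-0.5 * ((u - mu) / sigma) ** 2) / (np.sqrt(2 * np.pi) * sigma)

prior = gauss(x, 0.0, 1.0)            # p(x_{k-1} | Z^{k-1})

# dynamics p(x_k | x_{k-1}): drift by 1.0, process noise 0.5 (assumed values)
transition = gauss(x[:, None], x[None, :] + 1.0, 0.5)

predicted = transition @ prior * dx   # the integral over x_{k-1} as a matrix-vector product

# the predicted density stays normalized, shifts with the drift,
# and is broader than the prior (knowledge dissipation)
print(np.sum(predicted) * dx)         # ≈ 1.0
```

The predicted density remains normalized while its spread grows, which is the ‘knowledge dissipation’ visible in the plot sequence.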

[plots: pdf at $t_{k-1}$; $t_k$: no plot]

missing sensor detection: ‘data processing’ = prediction (not always: exploitation of ‘negative’ sensor evidence)

[plots: pdf at $t_{k-1}$, $t_k$; prediction for $t_{k+1}$]

missing sensor information: increasing knowledge dissipation

[plots: pdf at $t_{k-1}$, $t_k$; $t_{k+1}$: one plot]

sensor information on the kinematical object state

[plots: pdf at $t_{k-1}$, $t_k$; prediction for $t_{k+1}$; likelihood (sensor model)]

BAYES’ formula:

$$\underbrace{p(x_{k+1}|\mathcal{Z}^{k+1})}_{\text{new knowledge}} \;=\; \frac{p(\overbrace{z_{k+1}}^{\text{plot}}|x_{k+1})\;\overbrace{p(x_{k+1}|\mathcal{Z}^{k})}^{\text{prediction}}}{\int\! dx_{k+1}\; p(z_{k+1}|x_{k+1})\, p(x_{k+1}|\mathcal{Z}^{k})}$$
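On such a grid, BAYES’ formula is a pointwise product of likelihood and prediction followed by renormalization. A minimal sketch (Python/NumPy; the Gaussian measurement model and all numbers are assumed for illustration):

```python
import numpy as np

# Sketch of the filtering (Bayes) step on a 1-D state grid; the Gaussian
# measurement model and all numerical values are illustrative assumptions.

x = np.linspace(-10.0, 10.0, 401)
dx = x[1] - x[0]

def gauss(u, mu, sigma):
    return np.exp(-0.5 * ((u - mu) / sigma) ** 2) / (np.sqrt(2 * np.pi) * sigma)

prediction = gauss(x, 1.0, 1.5)       # p(x_{k+1} | Z^k)
z, sigma_r = 2.0, 1.0                 # received plot and measurement accuracy (assumed)

likelihood = gauss(z, x, sigma_r)     # p(z_{k+1} | x_{k+1}), viewed as a function of x
posterior = likelihood * prediction   # numerator of BAYES' formula
posterior /= np.sum(posterior) * dx   # denominator: integral of likelihood * prediction

# the posterior mean lies between the prediction mean and the plot,
# and the posterior is narrower than the prediction
mean = np.sum(x * posterior) * dx
```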

[plots: pdf at $t_{k-1}$, $t_k$; pdf at $t_{k+1}$ (Bayes)]

filtering = sensor data processing

[plots: pdf at $t_{k-1}$, $t_k$; $t_{k+1}$: three plots]

ambiguities by false plots: 1 + 3 data interpretation hypotheses (‘detection probability’, false alarm statistics)
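The weights of such interpretation hypotheses can be computed in the spirit of probabilistic data association from the detection probability, the false-alarm density, and the plot likelihoods. A sketch with assumed values (the weight structure is the standard one; none of the numbers come from the lecture):

```python
import numpy as np

# Sketch: weighting 1 missed-detection hypothesis + 3 plot hypotheses.
# P_D, the clutter density rho_F, the innovation spread s and the plot
# distances are all assumed values for illustration.

def gauss(nu, s):
    return np.exp(-0.5 * (nu / s) ** 2) / (np.sqrt(2 * np.pi) * s)

P_D, rho_F = 0.9, 0.01            # detection probability, false-plot density
s = 1.0                           # standard deviation of the innovation
innovations = [0.5, 2.0, 3.5]     # distances of the 3 plots from the prediction

w = [(1 - P_D) * rho_F]                           # hypothesis 0: target not detected
w += [P_D * gauss(nu, s) for nu in innovations]   # hypothesis i: plot i is the target
w = np.array(w)
w /= w.sum()                                      # normalize over all 4 interpretations

# the plot closest to the prediction dominates; the miss hypothesis and the
# distant plots keep small but nonzero weights (soft, delayed decision)
```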

[plots: pdf at $t_{k-1}$, $t_k$, $t_{k+1}$]

Multimodal pdfs reflect ambiguities inherent in the data.

[plots: pdf at $t_{k-1}$, $t_k$, $t_{k+1}$; prediction for $t_{k+2}$]

temporal propagation: dissipation of the probability densities

[plots: pdf at $t_{k-1}$, $t_k$, $t_{k+1}$; $t_{k+2}$: one plot]

association tasks: sensor data ↔ interpretation hypotheses

[plots: pdf at $t_{k-1}$, $t_k$, $t_{k+1}$; prediction for $t_{k+2}$; likelihood]

$$\text{BAYES:}\quad p(x_{k+2}|\mathcal{Z}^{k+2}) \;=\; \frac{p(z_{k+2}|x_{k+2})\; p(x_{k+2}|\mathcal{Z}^{k+1})}{\int\! dx_{k+2}\; p(z_{k+2}|x_{k+2})\, p(x_{k+2}|\mathcal{Z}^{k+1})}$$

[plots: pdf at $t_{k-1}$, $t_k$, $t_{k+1}$, $t_{k+2}$]

in particular: re-calculation of the hypothesis weights

[plots: pdf at $t_{k-1}$, $t_k$, $t_{k+1}$, $t_{k+2}$]

How does new knowledge affect the knowledge on a past state?

[plots: pdf at $t_{k-1}$, $t_k$; retrodiction for $t_{k+1}$; pdf at $t_{k+2}$]

‘retrodiction’: a retrospective analysis of the past

[plots: $t_{k-1}$, $t_k$, $t_{k+1}$, $t_{k+2}$]

optimal information processing at present and for the past

Multiple Hypothesis Tracking: Basic Idea
Iterative updating of conditional probability densities!

kinematic target state $x_k$ at time $t_k$, accumulated sensor data $\mathcal{Z}^k$

a priori knowledge: target dynamics models, sensor model, road maps

• prediction: $p(x_{k-1}|\mathcal{Z}^{k-1}) \xrightarrow[\text{road maps}]{\text{dynamics model}} p(x_k|\mathcal{Z}^{k-1})$

• filtering: $p(x_k|\mathcal{Z}^{k-1}) \xrightarrow[\text{sensor model}]{\text{sensor data } Z_k} p(x_k|\mathcal{Z}^{k})$

• retrodiction: $p(x_{l-1}|\mathcal{Z}^{k}) \xleftarrow[\text{dynamics model}]{\text{filtering output}} p(x_l|\mathcal{Z}^{k})$

– finite mixture: inherent ambiguity (data, model, road network)
– optimal estimators: e.g. minimum mean squared error (MMSE)
– initiation of pdf iteration: multiple hypothesis track extraction


Difficult Operational Conditions

object detection:
– small objects: detection probability $P_D < 1$
– fading: consecutive missing plots (interference)
– moving platforms: minimum detectable velocity

measurements:
– false returns (residual clutter, birds, clouds)
– low data update rates (e.g. long-range radar)
– measurement errors, overlapping gates

sensor resolution:
– characteristic parameters: band-/beam width
– group measurements: resolution probability
– important: qualitatively correct modeling

object behavior:
– applications: high maneuvering capability
– qualitatively distinct maneuvering phases
– dynamic object parameters a priori unknown

Demonstration: Multiple Hypothesis Tracking


Tracking Application: Ground Picture Production

GMTI Radar: Ground Moving Target Indicator

wide area, all-weather, day/night, real-time surveillance of a dynamically evolving ground or near-to-ground situation

GMTI Tracking: Some Characteristic Aspects
backbone of a ground picture: moving target tracks

• airborne, dislocated, mobile sensor platforms
• vehicles, ships, ‘low-flyers’, radars, convoys
• occlusions: Doppler-blindness, topography
• road maps, terrain information, tactical rules
• dense target / dense clutter situations: MHT


Examples of GMTI Tracks (live exercise)


On Characterizing Tracking / Fusion Performance

a well-understood paradigm: air surveillance with multiple radars

Many results can be transferred to other sensors (IR, E/O, sonar, acoustics).

Sensor Data Fusion: ‘tracks’ represent the available information on the targets associated to them with appropriate quality measures, thus providing answers to:

When? Where? How many? To which direction? How fast, accelerating? What?


By sensor data fusion we wish to establish one-to-one associations between:

targets in the field of view ↔ identified tracks in the tracking computer

Strictly speaking, this is only possible under ideal conditions regarding the sensor performance and underlying target situation. The tracking/fusion performance can thus be measured by its deficiencies when compared with this ideal goal.


1. Let a target be detected at first by a sensor at time $t_a$. Usually, a time delay is involved until a confirmed track has finally been established at time $t_e$ (track extraction). A ‘measure of deficiency’ is thus:

• extraction delay $t_e - t_a$.


2. Unavoidably, false tracks will be extracted in case of a high false return density (e.g. clutter, jamming/deception), i.e. tracks related to unreal or unwanted targets. Corresponding ‘deficiencies’ are:

• mean number of falsely extracted targets per time,
• mean life time of a false track before its deletion.


3. A target should be represented by one and the same track until leaving the field of view. Related performance measures/deficiencies:

• mean life time of tracks related to true targets,
• probability of an ‘identity switch’ between targets,
• probability of a target not being represented by a track.


4. The track inaccuracy (error covariance of a state estimate) should be as small as possible. The deviations between estimated and actual target states should at least correspond with the error covariances produced (consistency). If this is not the case, we speak of a ‘track loss’.

• A track must really represent a target!

Challenges:

• low detection probability • high clutter density • low update rate
• agile targets • dense target situations • formations, convoys
• target-split events (formation, weapons) • jamming, deception

Basic Tasks:

• models: sensor, target, environment → physics

• data association problems → combinatorics

• estimation problems → probability, statistics

• process control, realization → computer science


Summary: BAYESian (Multi-) Sensor Tracking

• Basis: In the course of time one or several sensors produce measurements of targets of interest. Each target is characterized by its current state vector, being expected to change with time.


• Objective: Learn as much as possible about the individual target states at each time by analyzing the ‘time series’ which is constituted by the sensor data.


• Problem: imperfect sensor information: inaccurate, incomplete, and possibly ambiguous. Moreover, the targets’ temporal evolution is usually not well-known.


• Approach: Interpret measurements and target vectors as random variables (RVs). Describe by probability density functions (pdf) what is known about them.


• Solution: Derive iteration formulae for calculating the pdfs! Develop a mechanism for initiation! By doing so, exploit all background information available! Derive state estimates from the pdfs along with appropriate quality measures!


How to deal with probability density functions?

• pdf $p(x)$: Extract probability statements about the RV $x$ by integration!

• naively: positive and normalized functions ($p(x) \ge 0$, $\int dx\, p(x) = 1$)

• conditional pdf $p(x|y) = \frac{p(x,y)}{p(y)}$: Impact of information on $y$ on the RV $x$?

• marginal density $p(x) = \int dy\; p(x,y) = \int dy\; p(x|y)\, p(y)$: Enter $y$!

• Bayes: $p(x|y) = \frac{p(y|x)\,p(x)}{p(y)} = \frac{p(y|x)\,p(x)}{\int dx\; p(y|x)\,p(x)}$: $p(x|y)$ from $p(y|x)$, $p(x)$!

• certain knowledge on $x$: $p(x) = \delta(x-y)$ ‘$=$’ $\lim_{\sigma\to 0} \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-\frac{1}{2}\frac{(x-y)^2}{\sigma^2}}$

• transformed RV $y = t[x]$: $p(y) = \int dx\; p(y,x) = \int dx\; p(y|x)\, p_x(x) = \int dx\; \delta(y - t[x])\, p_x(x) =: [\mathcal{T} p_x](y)$ ($\mathcal{T}\colon p_x \mapsto p$, “transfer operator”)
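These rules can be verified numerically on a discrete joint distribution; the following sketch (Python/NumPy; the joint table itself is an arbitrary, randomly generated assumption) checks marginalization and BAYES’ rule:

```python
import numpy as np

# Numerical check of marginalization and Bayes' rule on a discrete joint pmf
# (the joint table is an arbitrary, randomly generated assumption).

rng = np.random.default_rng(0)
p_xy = rng.random((4, 5))
p_xy /= p_xy.sum()                    # joint p(x, y), normalized

p_x = p_xy.sum(axis=1)                # marginal over y
p_y = p_xy.sum(axis=0)                # marginal over x
p_x_given_y = p_xy / p_y              # conditional p(x|y) = p(x,y)/p(y)
p_y_given_x = p_xy / p_x[:, None]     # conditional p(y|x) = p(x,y)/p(x)

# marginal via the conditional: p(x) = sum_y p(x|y) p(y)
assert np.allclose(p_x, (p_x_given_y * p_y).sum(axis=1))

# Bayes: p(x|y) = p(y|x) p(x) / p(y)
assert np.allclose(p_x_given_y, p_y_given_x * p_x[:, None] / p_y)
```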

Characterize an object by quantitatively describable properties: object state

Examples:

– object position $x$ on a straight line: $x \in \mathbb{R}$
– kinematic state $\mathbf{x} = (\mathbf{r}^\top, \dot{\mathbf{r}}^\top, \ddot{\mathbf{r}}^\top)^\top$, $\mathbf{x} \in \mathbb{R}^9$; position $\mathbf{r} = (x, y, z)^\top$, velocity $\dot{\mathbf{r}}$, acceleration $\ddot{\mathbf{r}}$
– joint state of two objects: $\mathbf{x} = (\mathbf{x}_1^\top, \mathbf{x}_2^\top)^\top$
– kinematic state $\mathbf{x}$, object extension $\mathbf{X}$; e.g. ellipsoid: symmetric, positive definite matrix
– kinematic state $\mathbf{x}$, object class; e.g. bird, sailing plane, helicopter, passenger jet, ...

Learn unknown object states from imperfect measurements and describe imprecise knowledge mathematically precisely by functions $p(x)$!


Interpret unknown object states as random variables, $x$ [1D] or $\mathbf{x}$, $\mathbf{X}$ [vector / matrix variate], characterized by corresponding probability density functions (pdf).

The concrete shape of the pdf p(x) contains the full knowledge on x!


Information on a random variable (RV) can be extracted from the corresponding pdf by integration.

at present: the one-dimensional case:

How probable is it that $x \in (a, b) \subseteq \mathbb{R}$ holds?

Answer: $P\{x \in (a,b)\} = \int_a^b dx\; p(x) \;\Rightarrow\; p(x) \ge 0$

in particular: $P\{x \in \mathbb{R}\} = \int_{-\infty}^{\infty} dx\; p(x) = 1$ (normalization)

intuitive interpretation: “the object is somewhere in $\mathbb{R}$”

loosely: $p(x)\, dx$ is the probability for $x$ having a value between $x$ and $x + dx$


How to characterize the properties of a pdf?

specifically: How to associate a single “expected” value to a RV?

The maximum of the pdf is sometimes but not always useful! (→ examples)

instead: Calculate the centroid of the pdf!

$\mathrm{E}[x] = \int_{-\infty}^{\infty} dx\; x\, p(x) = \bar{x}$ (“expectation value”)

more generally: Consider functions $g\colon x \mapsto g(x)$ of the RV $x$!

$\mathrm{E}[g(x)] = \int_{-\infty}^{\infty} dx\; g(x)\, p(x)$ (“expectation value of the observable $g$”)

Example: Consider the observable $\frac{1}{2} m x^2$ (kinetic energy, $x$ = speed)

An important observable: the “error” of an estimate

• Quality: How useful is an expectation value $\bar{x} = \mathrm{E}[x]$?

Consider special observables as distance measures:

$g(x) = |x - \bar{x}|$ or $g(x) = (x - \bar{x})^2$

quadratic measures: computationally more convenient!

‘expected error’ of the expectation value $\bar{x}$:

$\mathrm{V}[x] = \mathrm{E}[(x - \bar{x})^2]$, $\sigma_x = \sqrt{\mathrm{V}[x]}$ (variance, standard deviation)

Exercise 2.1: Show that $\mathrm{V}[x] = \mathrm{E}[x^2] - \mathrm{E}[x]^2$ holds.

The expectation value of the observable $x^2$ is also called the “2nd moment” of the pdf of $x$.
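Exercise 2.1 can also be cross-checked on samples, since the identity $\mathrm{V}[x] = \mathrm{E}[x^2] - \mathrm{E}[x]^2$ holds exactly for sample averages as well. A sketch with an arbitrarily chosen distribution (an assumption for illustration):

```python
import numpy as np

# Sample-based check of V[x] = E[x^2] - E[x]^2; the exponential
# distribution and its parameter are arbitrary assumptions.

rng = np.random.default_rng(1)
x = rng.exponential(scale=2.0, size=100_000)

mean = x.mean()                        # sample E[x]
var_direct = ((x - mean) ** 2).mean()  # sample E[(x - E[x])^2]
var_moment = (x ** 2).mean() - mean ** 2

# identical up to floating-point rounding
print(abs(var_direct - var_moment))
```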

Exercise 2.2

Calculate expectation and variance of the uniform density of a RV $x \in \mathbb{R}$ on the interval $[a, b]$:

$$p(x) = \mathcal{U}(\underbrace{x}_{\text{RV}};\, \underbrace{a, b}_{\text{parameters}}) = \begin{cases} \frac{1}{b-a} & x \in [a,b] \\ 0 & \text{otherwise} \end{cases}$$

Pdf correctly normalized? $\int_{-\infty}^{\infty} dx\; \mathcal{U}(x; a, b) = \frac{1}{b-a} \int_a^b dx = 1$

$\mathrm{E}[x] = \int_{-\infty}^{\infty} dx\; x\, \mathcal{U}(x; a, b) = \frac{b+a}{2}$

$\mathrm{V}[x] = \frac{1}{b-a} \int_a^b dx\; x^2 - \mathrm{E}[x]^2 = \frac{1}{12}(b-a)^2$
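The closed-form results of Exercise 2.2 can be cross-checked by numerical integration; the interval below is an example assumption:

```python
import numpy as np

# Numerical cross-check of Exercise 2.2: E[x] = (a+b)/2, V[x] = (b-a)^2/12
# for the uniform density; the interval [a, b] is an example assumption.

a, b = 2.0, 5.0
x = np.linspace(a, b, 100_001)
dx = x[1] - x[0]
p = np.full_like(x, 1.0 / (b - a))     # U(x; a, b) on its support

E = np.sum(x * p) * dx                 # ≈ (a + b) / 2 = 3.5
V = np.sum(x ** 2 * p) * dx - E ** 2   # ≈ (b - a)^2 / 12 = 0.75

assert abs(E - (a + b) / 2) < 1e-3
assert abs(V - (b - a) ** 2 / 12) < 1e-3
```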

Important example: $x$ normally distributed over $\mathbb{R}$ (Gauß)

– wanted: probabilities concentrated around $\mu$
– quadratic distance: $\|x-\mu\|^2 = \frac{1}{2}(x-\mu)^2/\sigma^2$ (mathematically convenient!)
– the parameter $\sigma$ is a measure of the “width” of the pdf: $\|\sigma\|^2 = \frac{1}{2}$
– for ‘large’ distances, i.e. $\|x-\mu\|^2 \gg \frac{1}{2}$, the pdf shall decay quickly
– simplest approach: $\tilde{p}(x) = e^{-\|x-\mu\|^2}$ ($> 0\ \ \forall x \in \mathbb{R}$; normalization?)
– normalized for $p(x) = \tilde{p}(x) \big/ \int_{-\infty}^{\infty} dx\; \tilde{p}(x)$!

A formula collection delivers: $\int_{-\infty}^{\infty} dx\; \tilde{p}(x) = \sqrt{2\pi}\,\sigma$

An admissible pdf with the required properties is obviously given by:

$$\mathcal{N}(x; \mu, \sigma) = \frac{1}{\sqrt{2\pi}\,\sigma} \exp\!\left(-\frac{(x-\mu)^2}{2\sigma^2}\right)$$

Exercise 2.3

Show for the Gauß density $p(x) = \mathcal{N}(x; \mu, \sigma)$: $\mathrm{E}[x] = \mu$, $\mathrm{V}[x] = \sigma^2$

$\mathrm{E}[x] = \int_{-\infty}^{\infty} dx\; x\, \mathcal{N}(x; \mu, \sigma) = \mu$

$\mathrm{V}[x] = \mathrm{E}[x^2] - \mathrm{E}[x]^2 = \sigma^2$

Use substitution and partial integration! Use $\int_{-\infty}^{\infty} dx\; e^{-\frac{1}{2}x^2} = \sqrt{2\pi}$!
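The results of Exercise 2.3 can likewise be cross-checked numerically; $\mu$ and $\sigma$ below are example values (assumptions for illustration):

```python
import numpy as np

# Numerical cross-check of Exercise 2.3: E[x] = mu, V[x] = sigma^2 for the
# Gauss density; mu and sigma are example assumptions.

mu, sigma = 1.5, 0.8
x = np.linspace(mu - 10 * sigma, mu + 10 * sigma, 200_001)
dx = x[1] - x[0]
p = np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (np.sqrt(2 * np.pi) * sigma)

assert abs(np.sum(p) * dx - 1.0) < 1e-6      # correctly normalized
E = np.sum(x * p) * dx
V = np.sum((x - E) ** 2 * p) * dx

assert abs(E - mu) < 1e-6
assert abs(V - sigma ** 2) < 1e-6
```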

How to incorporate certain knowledge into pdfs?

Consider the special pdf: infinitely sharp peak at $\mu$

$$\delta(x; \mu)\ \text{‘$=$’}\ \begin{cases} \infty & x = \mu \\ 0 & x \ne \mu \end{cases} \qquad \text{holding:}\quad \int_{-\infty}^{\infty} dx\; \delta(x; \mu) = 1$$

Intuitively interpretable as a limit:

$$\delta(x; \mu)\ \text{‘$=$’}\ \lim_{\sigma\to 0} \mathcal{N}(x; \mu, \sigma) = \lim_{\sigma\to 0} \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-\frac{1}{2}\frac{(x-\mu)^2}{\sigma^2}}$$

Alternative: $\delta(x; a)\ \text{‘$=$’}\ \lim_{b\to a} \mathcal{U}(x; a, b)$

For observables $g$ this holds: $\mathrm{E}[g(x)] = \int_{-\infty}^{\infty} dx\; g(x)\, \delta(x; y) = g(y)$

in particular: $\mathrm{V}[x] = \mathrm{E}[(x - \mathrm{E}[x])^2] = \mathrm{E}[x^2] - \mathrm{E}[x]^2 = 0$
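The limit interpretation can be made tangible numerically: as $\sigma \to 0$, expectation values under $\mathcal{N}(x; \mu, \sigma)$ approach $g(\mu)$. A sketch (the observable $g$, the value $\mu$, and the $\sigma$ sequence are assumptions for illustration):

```python
import numpy as np

# Sketch: E[g(x)] under N(x; mu, sigma) approaches g(mu) as sigma -> 0,
# i.e. the narrowing Gaussian behaves like a Dirac peak.
# g, mu and the sigma sequence are illustrative assumptions.

mu = 2.0
g = np.cos                             # any smooth observable

for sigma in (1.0, 0.1, 0.01):
    x = np.linspace(mu - 10.0, mu + 10.0, 400_001)
    dx = x[1] - x[0]
    p = np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (np.sqrt(2 * np.pi) * sigma)
    print(np.sum(g(x) * p) * dx)       # tends to cos(2.0) ≈ -0.4161
```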
