
Object-oriented verification of WRF forecasts from 2005 SPC/NSSL Spring Program Mike Baldwin Purdue University


Page 1: Object-oriented verification of WRF forecasts from 2005 SPC/NSSL Spring Program Mike Baldwin Purdue University

[Figure: Mean Geopotential for Cluster 4; contours 5280–5820, every 60]

Object-oriented verification of WRF forecasts from 2005 SPC/NSSL Spring Program

Mike Baldwin, Purdue University

Page 2:

References

• Baldwin et al., 2005: Development of an automated classification procedure for rainfall systems. Mon. Wea. Rev.

• Baldwin et al., 2006: Challenges in comparing realistic, high-resolution spatial fields from convective-scale grids. Symposium on the Challenges of Severe Convective Storms, 2006 AMS Annual Meeting.

• Baldwin et al., 2003: Development of an events-oriented verification system using data mining and image processing algorithms. 3rd Conf. on Artificial Intelligence, 2003 AMS Annual Meeting.

Page 3:

Types of data

• gridded fields of precipitation or reflectivity

• GRIB format has been used

• program expects models and observations to be on the same grid

Page 4:

Basic strategy

• compare forecast “objects” with observed “objects”

• objects are described by a set of attributes related to morphology, location, intensity, etc.

• multi-variate “distance” can be defined to measure differences between “objects”, combining all attributes

• “good” forecasts will have small “distances”; “bad” forecasts will have large “distances”

• errors for specific attributes for pairs of matching fcst/obs objects can be analyzed

Page 5:

Method

• find areas of contiguous precipitation greater than a threshold

• expand those areas by ~20% and connect objects that are within 20km of each other

• characterize objects by location, mean, variance, size, measures of shape

• compare every forecast object to every observed object valid at the same time
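The identification steps above can be sketched with standard image-processing tools. This is a minimal illustration, assuming a 1 mm threshold, a 4 km grid, and SciPy's connected-component labeling; it is not the actual Spring Program code.

```python
# Sketch of the object-identification step: threshold the precipitation
# grid, dilate ("expand") the raining areas so that regions within the
# connection distance merge, then label connected components.
import numpy as np
from scipy import ndimage

def identify_objects(precip, threshold=1.0, connect_km=20.0, grid_km=4.0):
    """Label contiguous precip > threshold, merging areas within connect_km."""
    mask = precip > threshold
    # Dilating every region by half the connection distance joins any two
    # regions whose edges are within ~connect_km of each other.
    radius = max(1, int(round(connect_km / (2.0 * grid_km))))
    struct = ndimage.generate_binary_structure(2, 2)  # 8-connectivity
    grown = ndimage.binary_dilation(mask, structure=struct, iterations=radius)
    labels, n = ndimage.label(grown, structure=struct)
    labels[~mask] = 0  # keep labels only where it actually rained
    return labels, n

# Two rain areas separated by a small gap become a single object.
field = np.zeros((50, 50))
field[10:20, 10:20] = 5.0
field[10:20, 22:30] = 5.0  # ~8 km gap on the assumed 4 km grid
labels, n = identify_objects(field)
```

With these defaults the two patches end up with the same label, mimicking the "expand and connect" behavior described above.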

Page 6:

Strengths of approach

• Attributes related to rainfall intensity and auto-correlation ellipticity were able to classify precipitation systems by morphology (linear/cellular/stratiform)

• Can define “similar” and “different” objects using as many attributes as deemed important or interesting

• Allows for event-based verification
– categorical
– errors for specific classes of precipitation events

Page 7:

Weaknesses of approach

• not satisfied with threshold-based object ID procedure

• sensitivity to choice of threshold

• currently no way to determine extent of “overlap” between objects

• does not include temporal evolution, just doing snapshots

Page 8:

Weaknesses, continued…

• Not clear how to “match” fcst & obs objects
– how far off does a fcst have to be to be considered a false alarm?

• How to weigh different attributes?
– is a 250 km spatial distance the same as a 5 mm precipitation distance?

• Do attribute distributions matter?
– a forecast of heavy rain verifying against an observation of heavy rain is a good forecast, even if the magnitude is off by 50%
– a 50% error in a light rain forecast is more significant

Page 9:

Example

[Figure panels: STAGE II, WRF2 CAPS]

Page 10:

1 mm threshold

[Figure panels: STAGE II, WRF2 CAPS]

Page 11:

5 mm threshold

[Figure panels: STAGE II, WRF2 CAPS]

Page 12:

analyze objects > (50 km)²

[Figure panels: WRF2 CAPS, STAGE II]

First object: Area = 12700 km², mean(ppt) = 6.5, σ(ppt) = 14.5, max corr @ 25 km = 0.06, ∂²corr @ 25 km = 0.97, θ @ 25 km = 107°, lat = 39.6°N, lon = 95.1°W

Second object: Area = 55200 km², mean(ppt) = 8.2, σ(ppt) = 30.6, max corr @ 25 km = 0.54, ∂²corr @ 25 km = 0.83, θ @ 25 km = 107°, lat = 40.7°N, lon = 97.3°W

Page 13:

Example

[Figure: STAGE II and WRF2 CAPS panels annotated with the same two objects and attributes as the previous slide]

Page 14:

analyze objects > (50 km)²

[Figure panels: WRF2 CAPS, STAGE II]

First object: Area = 20000 km², mean(ppt) = 22.6, σ(ppt) = 535.5, max corr @ 25 km = 0.47, ∂²corr @ 25 km = 0.82, θ @ 25 km = 149°, lat = 34.3°N, lon = 101.3°W

Second object: Area = 60000 km², mean(ppt) = 9.8, σ(ppt) = 57.7, max corr @ 25 km = 0.43, ∂²corr @ 25 km = 0.57, θ @ 25 km = 21°, lat = 40.0°N, lon = 99.6°W

Page 15:

Example

[Figure: STAGE II and WRF2 CAPS panels annotated with the same two objects and attributes as the previous slide]

Page 16:

New auto-correlation attributes

• Replaced ellipticity of AC contours with 2nd derivative of correlation in vicinity of max corr at specific lags (~25, 50, 75 km every ~10°)

Page 17:

Example

• Find contiguous regions of reflectivity

• Expand areas by 15%

• Connect regions within 20km

Page 18:

Example

• Analyze objects > 150 points (~3000 km²)

• Result: 5 objects

• Next, determine attributes for each object

Page 19:

Attributes

• Each object is characterized by a vector of attributes, with a wide variety of units, ranges of values, etc.

Size: area (number of grid boxes)

Location: lat, lon (degrees)

Intensity: mean, variance of reflectivity in object

Shape: difference between max-min auto-corr at 50, 100, 150 km lags

Orientation: angle of max auto-corr at 50, 100, 150 km lags

Page 20:

Object comparison

• each object is represented by a vector of attributes: x = (x₁, x₂, …, xₙ)ᵀ

• similarity/dissimilarity measures – measure the amount of resemblance or distance between two vectors

• Euclidean distance: d(x, y) = √( Σᵢ₌₁ⁿ (xᵢ − yᵢ)² )
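The Euclidean distance between two attribute vectors can be sketched as follows; the four attribute values are illustrative and assumed already standardized, not numbers from the slides.

```python
# Euclidean distance d(x, y) = sqrt(sum_i (x_i - y_i)^2) between two
# object-attribute vectors. Inputs are assumed already standardized so
# that no single attribute dominates the sum.
import numpy as np

def euclidean_distance(x, y):
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    return float(np.sqrt(((x - y) ** 2).sum()))

# Illustrative standardized attributes: (area, lat, lon, mean intensity)
fcst = np.array([0.30, 0.55, 0.40, 0.20])
obs = np.array([0.25, 0.50, 0.45, 0.60])
d = euclidean_distance(fcst, obs)
```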

Page 21:

LEPS

• Distance = 1 equates to difference between “largest” and “smallest” object for a particular attribute

• Linear for uniform distributions (lat, lon, θ)

• Have to be careful with …

• L1-norm: d(x, y) = Σᵢ₌₁ⁿ |xᵢ − yᵢ|  (example: AC diff = 0.4)

[Figure annotations: Fₒ = .08, Fₒ = .47]
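The LEPS idea on this slide can be sketched as a lookup in an empirical cumulative distribution; the reference areas below are illustrative, and the scaling is chosen so that the smallest and largest objects are exactly distance 1 apart, as the first bullet states.

```python
# LEPS-style standardization: map an attribute value to its empirical
# cumulative frequency F over a reference population of objects, then
# take |F(x) - F(y)| as the attribute distance.
import numpy as np

def empirical_cdf(sample):
    """Return F mapping a value to its empirical cumulative frequency,
    scaled so the smallest sample value maps to 0 and the largest to 1."""
    s = np.sort(np.asarray(sample, dtype=float))
    def F(x):
        return float(np.searchsorted(s, x, side="left") / (len(s) - 1))
    return F

# Illustrative reference population of object areas (km^2).
areas = [200, 500, 1200, 3000, 8000, 12700, 20000, 55200]
F = empirical_cdf(areas)
# LEPS-style distance between two objects for the area attribute:
leps_dist = abs(F(12700) - F(55200))
```

Because F is linear in rank rather than in value, a fixed distance means the same thing in crowded and sparse parts of the attribute's distribution.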

Page 22:

How to match observed and forecast objects?

• dᵢⱼ = ‘distance’ between Fᵢ and Oⱼ

• for each forecast object, choose closest observed object; if dᵢ* > d_T, then false alarm

• for each observed object, choose closest forecast object; if d*ⱼ > d_T, then missed event

• objects might “match” more than once…

[Figure: schematic with forecast objects F1, F2 and observed objects O1, O2, O3, marking a false alarm and a missed event]

Page 23:

Example of object verification

[Figure panels: ARW 2 km (CAPS) forecast and radar mosaic, with objects Fcst_1, Fcst_2, Obs_1, Obs_2 outlined]

Object identification procedure identifies 4 forecast objects and 5 observed objects

Page 24:

Distances between objects

• Use dT = 4 as threshold

• Match objects, find false alarms, missed events

       O_34   O_37   O_50   O_77   O_79
F_25   5.84   4.16   8.94   9.03  11.53
F_27   6.35   2.54   7.18   6.32   9.25
F_52   7.43   9.11   4.15   9.19   5.45
F_81   9.39   6.35   6.36   2.77   5.24
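Running the matching rule of the previous slide on this table with dT = 4 can be sketched as follows; only the table values and the threshold come from the slides.

```python
# Each forecast object takes its closest observed object (false alarm if
# that distance exceeds dT); each observed object takes its closest
# forecast object (missed event if that distance exceeds dT).
dist = {
    "F_25": {"O_34": 5.84, "O_37": 4.16, "O_50": 8.94, "O_77": 9.03, "O_79": 11.53},
    "F_27": {"O_34": 6.35, "O_37": 2.54, "O_50": 7.18, "O_77": 6.32, "O_79": 9.25},
    "F_52": {"O_34": 7.43, "O_37": 9.11, "O_50": 4.15, "O_77": 9.19, "O_79": 5.45},
    "F_81": {"O_34": 9.39, "O_37": 6.35, "O_50": 6.36, "O_77": 2.77, "O_79": 5.24},
}
dT = 4.0

false_alarms = sorted(f for f, row in dist.items() if min(row.values()) > dT)
missed = sorted(o for o in next(iter(dist.values()))
                if min(row[o] for row in dist.values()) > dT)
matches = {f: min(row, key=row.get) for f, row in dist.items()
           if min(row.values()) <= dT}
```

With these numbers, F_27 matches O_37 and F_81 matches O_77; F_25 and F_52 are false alarms (their closest obs objects sit at 4.16 and 4.15, just over dT); O_34, O_50, and O_79 are missed events.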

Page 25:

[Figure: median position errors of the matching obs object given a forecast object, for NMM4, ARW2, ARW4; annotated values .07, .08, .04, .22, .04, −.07]

Page 26:

[Figure-only slide]

Page 27:

Average distances for matching fcst and obs objects

• 1–30 h fcsts, 10 May – 03 June 2004

• Eta (12km) = 2.12

• WRF-CAPS = 1.97

• WRF-NCAR = 1.98

• WRF-NMM = 2.02

Page 28:

With set of matching obs and fcsts

• Nachamkin (2004) compositing ideas
– errors given fcst event
– errors given obs event

• Distributions of errors for specific attributes

• Use classification to stratify errors by convective mode

Page 29:

Automated rainfall object identification

• Contiguous regions of measurable rainfall (similar to Ebert and McBride 2000)

Page 30:

Connected component labeling

• Pure contiguous rainfall areas result in 34 unique “objects” in this example

Page 31:

Expand areas by 15%, connect regions that are within ~20 km

• Results in 5 objects

Page 32:

Useful characterization

• Attributes related to rainfall intensity and auto-correlation ellipticity were able to produce groups of stratiform, cellular, linear rainfall systems in cluster analysis experiments (Baldwin et al. 2005)

Page 33:

New auto-correlation attributes

• Replaced ellipticity of AC contours with 2nd derivative of correlation in vicinity of max corr at specific lags (50, 100, 150km, every 10°)
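One way such lag attributes can be computed is sketched below: correlate the field with shifted copies of itself at a single 50 km lag every 10°, on an assumed 4 km grid, and take a centered second difference of correlation over angle near the maximum. The exact attribute definitions in the slides may differ; this is an illustration of the idea.

```python
# Lagged-autocorrelation attributes: orientation = angle of maximum lag
# correlation; "sharpness" = second difference of correlation across
# angle at that maximum (2nd derivative, as described above).
import numpy as np

def lag_correlation(field, dx, dy):
    """Pearson correlation of the field with itself shifted by (dx, dy) cells."""
    ny, nx = field.shape
    a = field[max(0, -dy):ny - max(0, dy), max(0, -dx):nx - max(0, dx)]
    b = field[max(0, dy):ny - max(0, -dy), max(0, dx):nx - max(0, -dx)]
    return float(np.corrcoef(a.ravel(), b.ravel())[0, 1])

def orientation_attributes(field, lag_km=50.0, grid_km=4.0, step_deg=10):
    lag = lag_km / grid_km
    angles = np.arange(0, 180, step_deg)   # orientation is 180-degree periodic
    corrs = np.array([lag_correlation(field,
                                      int(round(lag * np.cos(np.radians(t)))),
                                      int(round(lag * np.sin(np.radians(t)))))
                      for t in angles])
    i = int(np.argmax(corrs))
    # centered second difference of correlation across angle at the maximum
    d2 = corrs[(i - 1) % len(corrs)] - 2 * corrs[i] + corrs[(i + 1) % len(corrs)]
    return int(angles[i]), float(corrs[i]), float(d2)

# An east-west band has maximum lag correlation at 0 degrees.
band = np.tile(np.exp(-((np.arange(80) - 40.0) / 5.0) ** 2)[:, None], (1, 80))
theta, cmax, d2 = orientation_attributes(band)
```

For a linear system the correlation peaks sharply along the line's axis, so the (negative) second difference is large in magnitude; for a round stratiform blob it is near zero, which is what makes this a useful morphology attribute.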

Page 34:

How to measure “distance” between objects

• How to weigh different attributes?
– Is a 250 km spatial distance the same as a 5 mm precipitation distance?

• Do attribute distributions matter?
– Is 55 mm − 50 mm the same as 6 mm − 1 mm?

• How to standardize attributes?
– X′ = (x − min)/(max − min)
– X′ = (x − mean)/σ
– Linear error in probability space (LEPS)

Page 35:

Estimate of dT threshold

• Compute distance between each observed object and all others at the same time

• dT = 25th percentile = 2.5

• Forecasts have similar distributions

[Figure: histogram of obs-obs distances with the 25th percentile marked]
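The threshold estimate above is a plain percentile of the obs-to-obs distance sample; the distances below are made up for illustration.

```python
# dT is taken as the 25th percentile of distances between each observed
# object and all other observed objects at the same valid time.
import numpy as np

obs_obs_distances = np.array([1.1, 1.8, 2.3, 2.6, 3.4, 4.0, 5.2, 6.8, 7.5, 9.1])
dT = float(np.percentile(obs_obs_distances, 25))
```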

Page 36:

[Figure panels: NCAR WRF 4 km forecast and Stage II radar ppt, with objects Fcst_1, Fcst_2, Obs_1, Obs_2]

Attributes

Fcst_1: Area = 70000 km², mean(dBZ) = 0.97, σ(dBZ) = 1.26, corr(50) = 1.17, corr(100) = 0.99, corr(150) = 0.84, θ = 173°, 173°, 173°, lat = 40.2°N, lon = 92.5°W

Fcst_2: Area = 70000 km², mean(ppt) = 0.60, σ(ppt) = 0.67, corr(50) = 0.36, corr(100) = 0.52, corr(150) = 0.49, θ = 85°, 75°, 65°, lat = 44.9°N, lon = 84.5°W

Obs_1: Area = 135000 km², mean(ppt) = 0.45, σ(ppt) = 0.57, corr(50) = 0.37, corr(100) = 0.54, corr(150) = 0.58, θ = 171°, 11°, 11°, lat = 39.9°N, lon = 91.2°W

Obs_2: Area = 285000 km², mean(ppt) = 0.32, σ(ppt) = 0.44, corr(50) = 0.27, corr(100) = 0.42, corr(150) = 0.48, θ = 95°, 85°, 85°, lat = 47.3°N, lon = 84.7°W