
Verification of SREF Aviation Forecasts at Binghamton, NY


Page 1: Verification of SREF Aviation Forecasts at Binghamton, NY

The 10th annual Northeast Regional Operational Workshop, Albany, NY

Verification of SREF Aviation Forecasts at Binghamton, NY

Justin Arnott
NOAA / NWS Binghamton, NY

Page 2: Verification of SREF Aviation Forecasts at Binghamton, NY


Motivation

Ensemble information is making an impact throughout the forecast process
  Examples: SREF, MREF, NAEFS, ECMWF ensemble

SREF resolution is reaching the mesoscale (32-45 km), a scale at which some aviation impacts may be resolvable

Can the SREF provide useful information in the aviation forecast process?

Page 3: Verification of SREF Aviation Forecasts at Binghamton, NY


SREF

21-member multi-model ensemble
  10 Eta (32 km, 60 vertical levels)
  3 NCEP NMM-WRF (40 km, 52 vertical levels)
  3 NCAR ARW-WRF (45 km, 35 vertical levels)
  5 NCEP RSM (45 km, 28 vertical levels)

Various IC/BCs and physical parameterizations

Page 4: Verification of SREF Aviation Forecasts at Binghamton, NY


SREF Aviation Forecasts

Numerous aviation parameters are created from the 09Z and 21Z runs
  Some are output directly, some are derived
  Include: CIG, VSBY, icing, turbulence, jet stream, shear, convection, precipitation, freezing level, fog, etc.

For creating TAFs, the CIG/VSBY fields may provide the most potential use

http://wwwt.emc.ncep.noaa.gov/mmb/SREF/SREF.html

Page 5: Verification of SREF Aviation Forecasts at Binghamton, NY


SREF Aviation Forecasts

Verification of SREF CIG/VSBY forecasts has been minimal
  Alaska Region has completed a study using SREF mean values

No verification study has been conducted over the lower 48

Page 6: Verification of SREF Aviation Forecasts at Binghamton, NY


Expectations

CIGS/VSBYS can vary greatly on scales far smaller than the 32-45 km scale of the SREF

Some MVFR/IFR events are more localized than others
  Summer MVFR/IFR tends to be more localized
  Winter MVFR/IFR is typically more widespread

Bottom line: Expect relatively poor SREF performance during the warm season, with improvements during the cool season


Page 7: Verification of SREF Aviation Forecasts at Binghamton, NY


The Study So Far…

Gather SREF CIG/VSBY data daily starting July 1, 2008
  Data provided specifically for the project by Binbin Zhou at NCEP

Compute POD/FAR/CSI/BIAS statistics for July-September at KBGM
  MVFR and IFR (due to small sample size)
  Investigate basing the forecast on different probability thresholds: 50%, 30%, 20%, 10% (a sketch of the thresholding and scoring follows below)
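For reference, here is a minimal sketch of how a probability threshold turns ensemble CIG/VSBY probabilities into categorical MVFR/IFR forecasts, and how POD, FAR, CSI, and BIAS follow from the resulting contingency counts. The function and variable names, and the example numbers, are illustrative only; they do not come from the study or from NCEP's processing.

```python
# Illustrative sketch only; names and numbers are hypothetical, not from the study.

def contingency_counts(prob_fcsts, observed, threshold):
    """Count hits, misses, and false alarms for one probability threshold.

    prob_fcsts : ensemble probabilities (0-1) that MVFR/IFR conditions occur
    observed   : booleans, True where MVFR/IFR conditions were observed
    threshold  : probability (e.g., 0.30) at/above which a categorical
                 MVFR/IFR forecast is issued
    """
    hits = misses = false_alarms = 0
    for p, obs in zip(prob_fcsts, observed):
        fcst = p >= threshold            # probability -> yes/no forecast
        if fcst and obs:
            hits += 1
        elif fcst and not obs:
            false_alarms += 1
        elif obs:
            misses += 1
    return hits, misses, false_alarms


def scores(hits, misses, false_alarms):
    """Standard categorical scores (assumes at least one event observed and forecast)."""
    pod = hits / (hits + misses)                     # probability of detection
    far = false_alarms / (hits + false_alarms)       # false alarm ratio
    csi = hits / (hits + misses + false_alarms)      # critical success index
    bias = (hits + false_alarms) / (hits + misses)   # >1 implies over-forecasting
    return pod, far, csi, bias


# Example with made-up values, verified at the 30% threshold
h, m, fa = contingency_counts([0.45, 0.10, 0.70, 0.25], [True, False, True, True], 0.30)
print(scores(h, m, fa))
```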

Page 8: Verification of SREF Aviation Forecasts at Binghamton, NY


The Study So Far… continued

Compare SREF results to WFO Binghamton, NY and GFS MOS forecasts
  Use the stats-on-demand web interface to obtain this data

Page 9: Verification of SREF Aviation Forecasts at Binghamton, NY


Results

Very little MVFR/IFR at KBGM in July-September
  IFR or MVFR only ~10% of the time

So, we're aiming at a very small target!

Page 10: Verification of SREF Aviation Forecasts at Binghamton, NY


Results – MVFR/IFR CIGS

        SREF Mean  SREF 50%  SREF 30%  SREF 20%  SREF 10%  BGM    GFS MOS
POD        0.61      0.48      0.75      0.86      0.99     0.72     0.60
FAR        0.66      0.46      0.55      0.66      0.74     0.34     0.37
CSI        0.27      0.34      0.39      0.32      0.26     0.53     0.44
BIAS       1.80      0.89      1.67      2.57      3.84     1.08     0.96

Page 11: Verification of SREF Aviation Forecasts at Binghamton, NY


Results – MVFR/IFR CIGS

WFO BGM and GFS MOS are more skillful than the SREF mean or any SREF probability threshold

The 30% probability threshold shows the best skill

Large false alarm ratios with nearly all SREF forecasts

Large positive biases for the SREF mean and nearly all probability thresholds
  i.e., over-forecasting MVFR/IFR CIGS (illustrated below)
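As a concrete reading of those bias numbers, using the standard frequency-bias definition (the 3.84 value is the SREF 10% entry from the table above):

```latex
\mathrm{BIAS} = \frac{\text{events forecast}}{\text{events observed}}
             = \frac{\text{hits} + \text{false alarms}}{\text{hits} + \text{misses}},
\qquad
\mathrm{BIAS} = 3.84 \;\Rightarrow\; \text{MVFR/IFR ceilings forecast nearly four times as often as they were observed.}
```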

Page 12: Verification of SREF Aviation Forecasts at Binghamton, NY


Comparing Apples with Oranges?

These results compare 9-21 hr SREF forecasts with 0-6 hr WFO BGM forecasts and 6-12 hr GFS forecasts
  Due to the later availability of SREF data (09Z SREF not available for use until the 18Z TAFs)

How well does a 9-24 hr GFS MOS (or BGM) forecast perform?
  21 hr not available using stats-on-demand

Page 13: Verification of SREF Aviation Forecasts at Binghamton, NY


Results – MVFR/IFR CIGS

        SREF Mean  SREF 50%  SREF 30%  SREF 20%  SREF 10%  BGM 9-24hr  GFS MOS 9-24hr
POD        0.61      0.48      0.75      0.86      0.99       0.63          0.64
FAR        0.66      0.46      0.55      0.66      0.74       0.33          0.27
CSI        0.27      0.34      0.39      0.32      0.26       0.48          0.52
BIAS       1.80      0.89      1.67      2.57      3.84       0.93          0.88

Page 14: Verification of SREF Aviation Forecasts at Binghamton, NY


Results – MVFR/IFR CIGS

WFO BGM / GFS MOS performance does not decrease substantially when the comparison time window is changed

Page 15: Verification of SREF Aviation Forecasts at Binghamton, NY


Results – MVFR/IFR VSBYS

        SREF Mean  SREF 30%  SREF 20%  SREF 10%  BGM    GFS MOS  BGM 9-24hr  GFS MOS 9-24hr
POD        0.04      0.14      0.31      0.58    0.66     0.65      0.52          0.54
FAR        0.00      0.46      0.59      0.64    0.53     0.58      0.60          0.63
CSI        0.04      0.12      0.21      0.28    0.38     0.34      0.29          0.28
BIAS       0.04      0.25      0.74      1.62    1.42     1.53      1.30          1.46

Page 16: Verification of SREF Aviation Forecasts at Binghamton, NY


Results – MVFR/IFR VSBYS

The SREF mean and the 30% and 20% thresholds fail to identify enough cases to be useful

The 10% threshold shows the greatest skill and is comparable to GFS MOS forecasts!
  There is a significant positive bias at this threshold

Page 17: Verification of SREF Aviation Forecasts at Binghamton, NY


Results – IFR CIGS

        SREF Mean  SREF 30%  SREF 20%  SREF 10%  BGM    GFS MOS  BGM 9-24hr  GFS MOS 9-24hr
POD        0.29      0.48      0.58      0.83    0.51     0.53      0.25          0.38
FAR        0.42      0.42      0.62      0.73    0.24     0.48      0.40          0.48
CSI        0.24      0.36      0.30      0.26    0.44     0.36      0.22          0.28
BIAS       0.50      0.83      1.50      3.03    0.67     1.00      0.42          0.73

Page 18: Verification of SREF Aviation Forecasts at Binghamton, NY


Results – IFR CIGS

The SREF mean is poor at identifying IFR CIGS

CSI scores for the SREF probability fields are an improvement on WFO BGM / GFS MOS
  Bias scores indicate under-forecasting at the 30% threshold but large over-forecasting at the 20% and 10% thresholds

WFO BGM / GFS MOS tend to under-forecast IFR CIGS

Page 19: Verification of SREF Aviation Forecasts at Binghamton, NY


Results – IFR VSBYS

        SREF Mean  SREF 30%  SREF 20%  SREF 10%  BGM    GFS MOS  BGM 9-24hr  GFS MOS 9-24hr
POD        0.04      0.15      0.26      0.67    0.45     0.29      0.28          0.29
FAR        0.00      0.78      0.88      0.87    0.46     0.70      0.62          0.76
CSI        0.04      0.10      0.09      0.12    0.33     0.18      0.19          0.15
BIAS       0.04      0.67      2.22      5.25    0.85     0.97      0.73          1.22

Page 20: Verification of SREF Aviation Forecasts at Binghamton, NY


Results – IFR VSBYS

The SREF cannot readily identify IFR VSBY situations except at the 10% threshold
  However, the tremendous bias at that threshold indicates these forecasts are not useful

Page 21: Verification of SREF Aviation Forecasts at Binghamton, NY


Summary

SREF performance is occasionally comparable to GFS MOS, making it potentially useful guidance
  Promising for "~direct" model output
  Hampered by its later arrival time at the WFO
  Mean fields show little/no skill
  Different probability thresholds show the best skill for different variables/categories

CIGS:
  SREFs frequently over-forecast MVFR/IFR CIGS
  SREFs perform surprisingly well with IFR CIGS
  The best-performing probability thresholds are 20-30%, balancing BIAS with CSI

Page 22: Verification of SREF Aviation Forecasts at Binghamton, NY


Summary, continued

VSBYS:
  SREFs have trouble identifying VSBY restrictions
  A 10% probability threshold is necessary to get any signal, but this may be useful for MVFR/IFR (not IFR alone)

Page 23: Verification of SREF Aviation Forecasts at Binghamton, NY


Future Plans

Continue computing statistics through the upcoming cool season
  Expect improved results given more widespread (i.e., resolvable) restrictions

Expand to other WFO BGM TAF sites

Work with NOAA/NWS/NCEP in improving calculations of CIG/VSBY

Page 24: Verification of SREF Aviation Forecasts at Binghamton, NY


Acknowledgements

Binbin Zhou – NOAA/NWS/NCEP
  For providing access to SREF data in near real-time