Project Phoenix – An Update EuMetCAL Workshop Toulouse, August 2008 Jaymie Gadal


Page 1

Project Phoenix – An Update

EuMetCAL Workshop

Toulouse, August 2008

Jaymie Gadal

Page 2

DRAFT – Page 2 – April 21, 2023

Phoenix – The Experiment

• The original Project Phoenix was an experiment:
  – To determine if the human forecaster remained relevant in the forecast process
  – To determine if the forecaster retained a usable measure of skill without NWP
  – To assess the areas of strength and weakness of the forecaster
  – To get an idea of what the role of the forecaster should be

Page 3

Phoenix Experiment - Methodology

• Forecasters were placed in a parallel ‘office’ – access to NWP was denied

• Forecasts were generated without NWP, but using the same preparation tool – Scribe

• Three outputs – the automatic, official, and Phoenix team forecasts – were verified and compared using a very intensive scoring system

• Successes and failures were analyzed in real time and as trends over time
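The three-way comparison can be sketched as follows. This is a minimal illustration of the idea, not the Phoenix verification code: the score here is a plain mean absolute error, and every value is invented for the example.

```python
# Minimal sketch of the three-way verification: the same error score is
# applied to the automatic, official, and Phoenix forecasts against the
# observations. All numbers are invented for illustration.

def mean_abs_error(forecasts, observations):
    """Mean absolute error between paired forecast and observed values."""
    return sum(abs(f - o) for f, o in zip(forecasts, observations)) / len(observations)

observed = [12.0, 15.0, 11.0, 9.0]           # e.g. daily max temperature (deg C)
sources = {
    "automatic": [13.5, 14.0, 10.0, 7.5],    # model-only forecast
    "official":  [12.5, 15.5, 11.5, 9.5],    # regular office forecast
    "phoenix":   [12.0, 14.5, 11.0, 8.5],    # no-NWP Phoenix team forecast
}

scores = {name: mean_abs_error(fcst, observed) for name, fcst in sources.items()}
best = min(scores, key=scores.get)           # source with the lowest error
```

In the real experiment the scoring was far more elaborate (see the Phoenix Scoring System slide), but the structure – one common score applied to each of the three forecast streams – is the same.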

Page 4

Phoenix Experiment - Results

• Forecasters still showed a significant ability to forecast better than the models in many circumstances – but in which ones?

• Previous attempts to determine where the forecaster should concentrate were found to be simplistic

• The greatest strengths stem from analysis, diagnosis, and prognosis (ADP) and from situational awareness (SA)

Page 5

A Few Particular & Interesting Results

• When forecasters were ‘right’, they tended to ‘hedge’

• A whole lot of tweaking was going on, very little of it successful

• Junior forecasters often performed better than experienced ones – why?

• Some parameters and situations are best left alone

• Some model weaknesses were identified

Page 6

Phoenix - Conclusions

• The forecaster needs to know, on a parameter-by-parameter basis, when to intervene

• Operational routines must stress:
  – ADP prior to consultation of the models
  – Constant situational awareness

• Real-time verification is more critical than previously recognized

• The optimum human/machine mix remains unclear – there is no clear division of roles

Page 7

Notes Regarding Severe Weather & Warning Situations

• The role of the forecaster in times of extreme weather remains more traditional:
  – NWP continues to fail to capture extreme events – forecaster intervention is often required
  – Forecaster interpretation of NWP results is strongly required
  – ADP is even more critical
  – SA and rapid response remain essential

Page 8

Role of the Forecaster

• Choosing the best model?

• Short range only?

• High impact weather only?

• Interpretation of models?

• Consultation?

• Integration of ensemble prediction results?

• Tuning of automatic forecasting algorithms?

Page 9

Phoenix – The Training Program

• Experiment converted into a training simulator

• Combatting “Meteorological Cancer”

• Forecasters were employing an operational routine which effectively by-passed their strengths

• Return to ADP/SA results in improved forecaster performance

Page 10

Steps Taken

• 2007/2008 – All MSC forecasters given a week of Phoenix simulator training

• Phoenix simulator training given to all new forecasters

• Offices continuing to give staff Phoenix simulator training periodically

• Simulator extended into real-time by automation of the scoring system

Page 11

Phoenix Scoring System

• An error score, essentially forecast minus observed

• Error scores are normalized between parameters

• Scores for different time periods, parameters, and locations can be weighted according to relative importance

• Relative importance is determined from user surveys

• The score parallels the forecaster’s intention (it is not a ‘proper’ score)

• Scores are rolled up for summaries and can be drilled down to discover ‘root causes’

• Output is available in XML for further analysis
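A rough sketch of how such a scoring scheme fits together. The parameter names, normalization constants, and weights below are illustrative assumptions, not the actual Phoenix values:

```python
# Hedged sketch of the scoring scheme described above: a raw error
# (forecast minus observed) per parameter is normalized so that
# different parameters become comparable, weighted by an importance
# taken from user surveys, and rolled up into one summary score while
# the per-parameter detail is kept for drill-down. All constants here
# are illustrative assumptions.

TYPICAL_ERROR = {"temperature": 2.0, "wind_speed": 5.0}  # assumed typical error magnitudes (deg C, kt)
WEIGHT = {"temperature": 0.7, "wind_speed": 0.3}         # assumed relative importance from user surveys

def parameter_score(param, forecast, observed):
    """Normalized absolute error for one parameter."""
    return abs(forecast - observed) / TYPICAL_ERROR[param]

def rolled_up_score(forecast, observed):
    """Weighted summary score, plus per-parameter detail for drill-down."""
    detail = {p: parameter_score(p, forecast[p], observed[p]) for p in forecast}
    summary = sum(WEIGHT[p] * s for p, s in detail.items())
    return summary, detail

summary, detail = rolled_up_score(
    {"temperature": 14.0, "wind_speed": 20.0},  # forecast
    {"temperature": 12.0, "wind_speed": 10.0},  # observed
)
```

Rolling the weighted scores up gives the summary figure; keeping the per-parameter detail around is what makes the drill-down to ‘root causes’ possible.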

Page 12

Example Output Scores

• 23_SITES_AM_ISSUE_2008-08-24scores_score.xls

• Phoenix Monthly Report.xls

Page 13

Automated Phoenix Scoring

• Output is generated in near-real time for two dozen stations daily

• A user interface allows scores to be generated for any given forecast

• Studies can be done seasonally, by situation, or per individual

• Scores can be configured for different users

Page 14

Other Uses of Phoenix

• Simulations have been run using the methodology to investigate severe weather forecasting, aviation, extended range, and marine forecasting

• Results and conclusions were generally the same, with different levels of intervention required

• Other types of simulators

• Research validation

Page 15

What’s Next

• Completion of the automation of the full range of parameters; expansion to other forecasts

• Develop Weather Event Simulator capability using Ninjo, with Phoenix for evaluation

• Evaluation of training

• Use in QA for ISO certification

• Extension to more parameters and forecast types in real time

• More comprehensive use in operations management, identification of needed research, and assessment of training needs