
Desktop Weather-Forecast System

Jim Kinter, Brian Doty,

Eric Altshuler, Mike Fennessy

Center for Ocean-Land-Atmosphere Studies

Calverton, Maryland USA

Desktop Weather-Forecast System

Problem

Three barriers to using current operational technology:

1. Global data sets are very large in volume and granularity; large communications bandwidth is required to obtain them

2. Data assimilation and global numerical weather prediction (NWP) models are very large calculations; supercomputers are required to run them

3. Global and regional models are run by experts in NWP and supercomputing; a cadre of such experts must be trained

Opportunity

• Global weather forecasts are freely available from the US National Weather Service (NWS); data is provided at 1° and 0.5° resolution

• High-resolution (20 km mesh or finer) regional weather forecasts can be produced with mesoscale models, nested in global forecasts

Desktop Weather-Forecast System

Proposed Solution

Desktop Weather-Forecast System (DWS), which combines:

• Distributed data access and analysis technology (GrADS Data Server)

• US NWS regional weather prediction model (workstation Eta)

Desktop Weather-Forecast System

[Diagram: global weather forecasts from the NWS (produced at NCEP) flow to the COLA GrADS Data Server, which delivers region-specific initial conditions (ICs) and lateral boundary conditions (BCs) over the World Wide Web to the user's regional NWP model.]

Enables real-time weather forecasts through:

Easy access to real-time initial and boundary conditions data

Easy configuration of regional NWP model for any locality in the world

Readily available, low-cost personal computing hardware/software

Moderate speed Internet connection

Desktop Weather-Forecast System

Advantages:

• Runs on low-cost computers such as PCs (Linux); also runs on UNIX workstations; capable of utilizing multiple processors via MPI

• Includes easy-to-use scripts for selecting a domain, running a forecast, and displaying model output

• Provides a unique capability to:

  View local or regional weather observations
  Create locally-relevant custom displays
  Make a regional weather forecast
  Graphically merge local weather observations and analysis/forecast products

Desktop Weather-Forecast System

Ease of use:

Point-and-click interface to set up the region the first time

Automated access to US NWS global NWP model output for:

  Atmospheric initial conditions
  Initial soil wetness, snow depth
  Surface boundary conditions (sea surface temperature, sea ice) and lateral boundary conditions (atmospheric circulation)

Automated linkage to powerful desktop display program (GrADS) to visualize results of forecast

Full documentation for getting data and running a forecast

Desktop Weather-Forecast System

Computer specifications and features

• Dell Precision Workstation 470n

• Dual Intel Xeon processors, 3.40 GHz, 2 MB L2 cache per processor

• 6 GB total memory

• 500 GB total disk space

• Red Hat Enterprise Linux ES 4

• Intel Fortran Compiler 9.0 for Linux

• Intel MPI Library 2.0

DWS Installation: Preliminary steps

• README – brief instructions for setting up and running DWS. Read this first!

• topo – directory containing global 30” topography data

• wseta.tar.Z – DWS package (compressed tar file)

Working directory contents: README, topo, wseta.tar.Z

DWS Installation: Preliminary steps

• Uncompress and untar wseta.tar.Z (a command sketch for these steps follows this list)

• After untarring, remove wseta.tar

• Change to the “topo” directory

• Identify the set of compressed tar files that is sufficient to cover your domain of interest. The files are named using a convention that is described on the next slide.

• Copy this set of files to ../worketa_all/eta/static/topo and change to this directory

• Uncompress and untar all of the .tar.Z files

• After untarring, remove the tar files to save disk space

• The files named U* contain the raw 30” topography data

• For the whole world, the uncompressed topo data requires about 1.8 GB of disk space; if disk space is not an issue, just copy all of the .tar.Z files rather than trying to figure out which ones you need to cover your domain
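A minimal command sketch of the steps above (assuming the working directory layout shown earlier; the sector file 00N_2.tar.Z is just an example, use whichever files cover your domain):

    uncompress wseta.tar.Z      # yields wseta.tar
    tar xf wseta.tar            # unpacks the worketa_all directory tree
    rm wseta.tar
    cd topo
    cp 00N_2.tar.Z ../worketa_all/eta/static/topo
    cd ../worketa_all/eta/static/topo
    uncompress 00N_2.tar.Z
    tar xf 00N_2.tar && rm 00N_2.tar    # the U* files now hold the raw 30" data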

DWS Installation: Preliminary steps

• The compressed tar files containing topography data are named as follows:

• The first three characters of the file name identify the latitude of the southern boundary of the sector (each sector covers 60 deg longitude by 30 deg latitude)

• The fifth character of the file name (a number from 1 to 6) indicates the sector’s longitudinal position

• Example: 00N_2.tar.Z contains data covering 00N-30N and 120W-60W (El Salvador is located in this sector)

• An image file showing a world map with sector boundaries overlaid (rev_topo_plot_crop.gif) is provided in the directory containing the compressed topography data

• The next slide graphically illustrates this file naming scheme

Areal coverage and naming of topography data files

[World map showing sector boundaries: longitude sectors 1-6, each 60 deg wide, counted eastward from 180 (180-120W, 120W-60W, 60W-0, 0-60E, 60E-120E, 120E-180); latitude bands bounded at 90S, 60S, 30S, 0, 30N, 60N, 90N.]
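To make the naming rule concrete, here is a hypothetical bash helper (not part of DWS; the name topo_file is my own) that prints the sector file covering a given point:

    # Hypothetical helper derived from the naming rules above.
    # Usage: topo_file LAT LON  (integer degrees; LAT in -90..89, LON in -180..179)
    topo_file() {
      lat=$1 lon=$2
      s=$(( lat >= 0 ? (lat/30)*30 : ((lat-29)/30)*30 ))  # southern boundary of 30-deg band
      if [ "$s" -ge 0 ]; then band=$(printf '%02dN' "$s")
      else band=$(printf '%02dS' $(( -s ))); fi
      sec=$(( (lon + 180) / 60 + 1 ))                     # sector 1..6, counted eastward from 180
      echo "${band}_${sec}.tar.Z"
    }
    topo_file 14 -89    # prints 00N_2.tar.Z (the El Salvador sector)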

DWS Installation: Preliminary steps

• There are a few additional settings in the “build” and “runeta” scripts, and elsewhere, which may need to be changed under certain circumstances. These will be covered later when the scripts are discussed in detail.

DWS files and directories: Important top-level files and directories

worketa_all – top-level directory

Within worketa_all:

build – main DWS build script
runeta – main DWS run script
eta – Eta model codes and scripts
libraries – libraries used by various codes
post_new – post-processor
quilt – intermediate post-processing code
copygb – code to interpolate model output to a different horizontal grid
data – directory for storing initial and lateral boundary input data
doc – documentation directory
dprep – data acquisition and pre-processing codes and scripts

DWS files and directories: two important scripts

• Top-level directory is “worketa_all”

• The “build” script sets the model configuration (domain, resolution, etc.) and compiles the model codes

• The “runeta” script sets run-time variables (date, forecast length, etc.) and runs the model

DWS Documentation

• README – brief instructions for setting up and running DWS. Read this first!

• worketanh_nwsoil.pdf, worketanh_nwsoil.ps – Original documentation for the workstation Eta model, written by Matt Pyle of NCEP (2002)

doc directory contents: README, worketanh_nwsoil.pdf, worketanh_nwsoil.ps

The “eta” directory: Important subdirectories

• bin – contains fixed-field data and pre-processor scripts

• exe – contains compiled executables

• install – contains scripts that are called by the main build script during the compilation process

Subdirectories of eta: bin, exe, install, runs, src, static (including static/topo), util

The “eta” directory: Important subdirectories

• runs – the model and post-processor run in this directory; contains all necessary pre-processed input data, as well as model output data after a run is finished

• src – contains model source codes


The “eta” directory: Important subdirectories

• static – contains a subdirectory “topo” where 30” topography data is stored

• util – contains scripts and codes that implement the DWS graphical user interface (GUI) and perform other miscellaneous tasks


The DWS Graphical User Interface (GUI)

• The DWS GUI consists of three parts

• The first part is invoked by the “build” script and enables the user to specify the model domain, resolution, and several other parameters using a simple point-and-click interface

• The second part is invoked by the “runeta” script and enables the user to specify several run-time variables, such as the date, forecast length, and cumulus convection parameterization

• The third part is invoked directly from GrADS and enables the user to view 13 different model output fields, for all available vertical levels and forecast times. It is also capable of animation, zooming, and saving plots to a file.

• All three parts of the DWS GUI are optional; the user may choose to disable the GUI in the “build” and/or “runeta” scripts and instead specify the model parameters directly in the scripts. This is useful for more experienced users and for those who wish to run DWS in batch mode (e.g. in a cron job).

The DWS GUI, Part I: Domain selection and resolution specification

• Execute the “build” script to begin the process of configuring and compiling DWS for your domain

• For a given domain and resolution, this script only needs to be run once

• After the user has completed all the steps in the GUI, control returns to the “build” script and the codes are then compiled

• The next 12 slides give a step-by-step demonstration of how the GUI (Part I) works

Based on the location of your domain of interest, choose whether the world map is centered on Greenwich or the International Date Line

We have chosen a world map centered on 0 degrees (Greenwich)

We have not selected the actual domain yet; we have just zoomed in on the overall region of interest. From this screen, the approximate domain is chosen.

Choose 1.0 deg – the 0.5 deg data is not currently available, though it may be in the future

This is the exact domain boundary – note that the sides are curved and therefore they do not have constant latitude or longitude

If you wish to start over again and select a different domain, click “No”

Always choose “Eta” for DWS applications

NCEP recommends using 25 hPa

Always choose “GFS from GDS”. The other option exists only for specialized applications.

This screen summarizes your choices. At this point, the GUI is finished and after you click on “Continue” the model will be compiled.

The DWS GUI, Part II: Running the model

• Execute the “runeta” script to start a DWS forecast or historical simulation

• After the user has completed all the steps in the GUI, control returns to the “runeta” script

• The “runeta” script then fetches initial and lateral boundary data and runs the pre-processor, forecast model, and post-processor

• The next 13 slides give a step-by-step demonstration of how the GUI (Part II) works

In this demonstration, we have chosen to run a historical simulation

Screens for specifying the initial date do not appear in the case of a real-time forecast

In this demonstration, we have chosen the initialization date to be 17 May 2006. The year must be between 1901 and 2099.

Select 00 UTC or 12 UTC – although GFS is run operationally 4 times per day, the COLA GDS only has data from the 00 and 12 UTC runs

Choose 6 h – this is the interval at which GFS forecast data is available on the COLA GDS

Choose “lat-lon” – the GUI for viewing DWS output requires the data to be on a lat-lon grid

Select “Hydrostatic” – nonhydrostatic dynamics is an experimental feature

This screen summarizes your choices. At this point, the GUI is finished and after you click on “Continue” the model run will start.

The DWS GUI, Part III: Visualizing model output

• After a DWS run has finished:

• Go to the worketa_all/eta/util directory

• Start the data visualization program GrADS by entering the command “gradsc”

• At the GrADS prompt, run the DWS output visualization GUI (a GrADS script) by entering the command “view_gui”. (These steps are sketched below.)
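A quick sketch of that session (gradsc and view_gui as named above; ga-> is the GrADS prompt):

    cd worketa_all/eta/util
    gradsc                # start GrADS
    ga-> view_gui         # run the DWS output visualization GUI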

• As the user performs different tasks with the GUI, useful status information is displayed in the GrADS command window

• The next 18 slides give a step-by-step demonstration of how the GUI (Part III) works

The screen looks like this before any fields have been displayed

Click on “Field” to access a drop-down menu showing the 13 available fields, then click on the field you wish to display

After clicking on “SLP” the initial sea-level pressure is displayed

Click on “Fcst Hour” to access a drop-down menu showing the available forecast hours, then click on the hour you wish to display for this field

After clicking on forecast hour 6, the 6-hour forecast for SLP is displayed

We have changed the field to “Precip” and the 6-hour precipitation forecast is displayed. Note that the forecast hour has not changed.

We have changed the field to “Wind” and the level to 850 so the 850-hPa wind vectors and speed (shaded contours) are displayed

Click on “Level” to access a drop-down menu showing the available pressure levels, then click on the level you wish to display for this field

After clicking on level 200, the 200-hPa wind vectors and speed (shaded contours) are displayed

We have changed the field to “2-m Tmp” and the 6-hour forecast for 2-meter temperature is displayed

Click on “Zoom” to access the zoom feature, then select the region you wish to zoom in on using the mouse

We have zoomed in on El Salvador and the 6-hour forecast for 2-meter temperature is displayed

We have changed the field to “U Wind” and the 200-hPa zonal wind is displayed. We also returned to the original domain by clicking on “Unzoom”.

We have changed the field to “Prc Wat” and the 6-hour forecast for precipitable water is displayed

We have changed the field to “Rel Hum”, the level to 700 and the forecast hour to 12, and the 12-hour forecast for 700-hPa relative humidity is displayed

Click on “Animate” to cycle through all forecast hours for this field and level. There is a one-second pause between frames.

Click on “Print” to save the current screen display to a GrADS metafile. This metafile can be processed by other GrADS utilities and printed.

Click on “Quit” to exit from the GUI, then type “quit” at the GrADS prompt to quit GrADS

Using the “build” script: Overview

• This script handles domain selection and compilation of the model codes

• It needs to be run only once for a given model domain

• The user may choose to use the DWS GUI or specify the model domain parameters directly in the script

• There are several variables that must be set regardless of whether or not the GUI is used

Using the “build” script: General script variables

• mach: machine type, can be set to DEC, LINUX, SGI, IBM or HP [LINUX]

• fc: name of Fortran 90 compiler [ifort]

• gui: yes or no – yes: use DWS GUI (Part I); no: do not use GUI, set parameters directly in script

• GRADS: full path name of GrADS executable [/usr/local/grads/bin/gradsdods]

• GADDIR: directory containing data needed by GrADS [/usr/local/grads/lib]

• The machine type, compiler name and GrADS variables have already been set to appropriate values, so the user need only be concerned with the variable “gui”. A sketch of these settings follows below.
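For reference, a hypothetical excerpt of these assignments near the top of “build” (values are the defaults listed above; the actual script layout may differ):

    mach=LINUX                              # machine type
    fc=ifort                                # Fortran 90 compiler
    gui=yes                                 # set to no to bypass the GUI
    GRADS=/usr/local/grads/bin/gradsdods    # GrADS executable
    GADDIR=/usr/local/grads/lib             # GrADS data directory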

Using the “build” script: Model domain parameters

• If you are using the GUI (gui=yes), the parameter settings listed on this slide and the next two slides are ignored!

• res: horizontal resolution of model (km). Recommended values are 80, 48, 32, 22, 15, or 10. [22]

• lonc: longitude (deg) of domain center, longitude increases eastward [-89.0]

• latc: latitude (deg) of domain center [14.0]

• wlon: zonal width of domain (deg) along boundary closest to equator [20.0]

• wlat: meridional width of domain (deg) along central meridian [13.3]

• Note: wlon and wlat only provide an approximate size for the domain. The exact domain, which is calculated using formulas from spherical trigonometry, will not, in general, be a rectangle in lat-lon space, but it will contain the rectangle specified by the user. (A worked example of these settings follows below.)
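Putting the defaults together, a hypothetical GUI-free configuration for the 22-km Central America domain used in this presentation:

    gui=no        # bypass the GUI so the settings below take effect
    res=22        # horizontal resolution (km)
    lonc=-89.0    # domain center longitude (89W)
    latc=14.0     # domain center latitude (14N)
    wlon=20.0     # approximate zonal width (deg)
    wlat=13.3     # approximate meridional width (deg)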

Using the “build” script: Model domain parameters (continued)

• dlam: model grid resolution in longitude (deg) [0.1540698]

• dphi: model grid resolution in latitude (deg) [0.1408451]

• The table below shows recommended values for resolution-dependent parameters. These values are the ones selected automatically when the GUI is used and the user chooses a horizontal resolution. The variables dt and nphs are discussed later in the section describing the “runeta” script.

res (km)   dlam (deg)   dphi (deg)   dt (s)   nphs
80         0.5769231    0.5384615    200        4
48         0.3333333    0.3076923    120        4
32         0.2222222    0.2051282     90        6
22         0.1540698    0.1408451     60       10
15         0.0967742    0.0952381     30       20
10         0.0666667    0.0657895     20       30

Using the “build” script: Model domain parameters (continued)

• dres: horizontal resolution of input data (deg). At present, leave this set to 1.0 [1.0]

• vc: vertical coordinate – 1 for eta, 2 for sigma. Do not change this setting. [1]

• lm: number of vertical levels in model. The value must be 38, 45, 50 or 60. [50]

• pt: pressure (hPa) at model top. The value must be 25 or 50 (25 is recommended). [25]

• conf: do not change this setting [gds]

Using the “build” script: Miscellaneous parameters

• LSM: number of pressure levels specified in the namelist FCSTDATA in the “runeta” script. If you want to get post-processed output on more pressure levels, LSM may be increased. [21]

• INPES: number of processing elements (PEs) in the I-dimension [1]

• JNPES: number of processing elements (PEs) in the J-dimension [2]

• The product of INPES and JNPES must not exceed the number of CPUs on the machine, which in our case is 2. It is best to leave these settings alone.

• Further down in the script are settings for various compiler options. These have been set properly for this machine and therefore do not need to be changed.

Using the “build” script: Compilation

• Execute the “build” script by typing “build”. If gui=yes, the GUI will guide the user through the process of selecting the model domain and other parameters.

• After the GUI is finished, or if it is not used, compilation typically takes a few minutes

• If the user wishes to change any of the model domain parameters or miscellaneous parameters, the “build” script must be run again after making the changes

How to fix a known problem

• For domains with high resolution and/or large size, it may be necessary to change a setting in three files:

  worketa_all/eta/src/etafcst_all/parmbuf
  worketa_all/eta/src/model_with_kf/parmbuf
  worketa_all/quilt/parmbuf

• The variable “IBUFMAX” is currently set to 25000000. This should be sufficient for most applications but it may be too small in some cases. If it is too small, the model and/or post-processor will fail. Determining the minimum required value is somewhat complicated, so the user may choose to do it by trial and error. It should not be set to an excessively large value, however, because large values significantly increase the memory required to run the model.

• After increasing the value of IBUFMAX in all three files, you must re-run the “build” script (a scripted way to make this edit is sketched below)
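One hypothetical way to script the change in all three files at once (the doubled value 50000000 is only an example, subject to the memory caveat above; assumes GNU sed):

    for f in worketa_all/eta/src/etafcst_all/parmbuf \
             worketa_all/eta/src/model_with_kf/parmbuf \
             worketa_all/quilt/parmbuf
    do
      sed -i 's/IBUFMAX *= *25000000/IBUFMAX = 50000000/' "$f"
    done
    # then re-run the "build" script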

Using the “runeta” script: Overview

• This script sets the runtime variables (date, forecast length, etc.) and runs the scripts and codes that fetch initial and lateral boundary data, the pre-processing codes, the forecast model, and the post-processing codes

• It needs to be run each time a DWS forecast is to be performed

• The user may choose to use the DWS GUI or specify the runtime variables directly in the script. The “runeta” script may also be run in batch mode, without the GUI, e.g. in a cron job.

• There are several variables that must be set regardless of whether or not the GUI is used

Using the “runeta” script: General script variables

• root: full path name of DWS top-level directory, up to and including worketa_all [/home/ela/worketa_all]

• gui: yes or no – yes: use DWS GUI (Part II); no: do not use GUI, set parameters directly in script

• GRADS: full path name of GrADS executable [/usr/local/grads/bin/gradsdods]

• GADDIR: directory containing data needed by GrADS [/usr/local/grads/lib]

• gmap: full path name of gribmap executable. gribmap is a GrADS utility for generating the index file for GRIB data. [/usr/local/grads/bin/gribmap]

• The GrADS variables have already been set to appropriate values, so the user need only be concerned with the variables “root” and “gui”

Using the “runeta” script: Runtime variables

• If you are using the GUI (gui=yes), the variable settings listed on this slide and the next slide are ignored!

• mode: real for real-time forecast, hist for historical simulation

• ymd: initialization date, in YYYYMMDD form, either specified (for historical runs) or set to the current date based on Coordinated Universal Time (UTC) (for real-time runs)

• hh: hour (UTC) of initialization (currently, must be 00 or 12)

• fhend: forecast length (h). Must be 256 or less; larger values will require code changes. The model has not been extensively tested beyond 96 hours. [72]

• fhint: interval to get lateral boundary data (h). Leave as is for now. [6]

• outfrq: model output interval (h). Recommended values are 1, 2, 3 or 6. [6]

• outgrd: model output grid: latlon for lat-lon, lmbc for Lambert conformal. The output visualization GUI requires that outgrd be set to latlon. [latlon]

An example setting of these variables is sketched below.
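A hypothetical GUI-free setting of these variables for the historical case demonstrated earlier (17 May 2006; shown here with hh=12, though the demo allowed 00 or 12):

    gui=no
    mode=hist        # historical simulation (real for real-time)
    ymd=20060517     # initialization date, YYYYMMDD
    hh=12            # initialization hour (00 or 12 UTC only)
    fhend=72         # forecast length (h)
    fhint=6          # lateral boundary data interval (h)
    outfrq=6         # model output interval (h)
    outgrd=latlon    # required by the output visualization GUI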

Using the “runeta” script: Runtime variables (continued)

• conv: cumulus convection scheme: 1 for Betts-Miller, 2 for Kain-Fritsch. Choice may depend on model performance for the particular domain of interest. [2]

• dyn: model dynamics: 1 for hydrostatic, 2 for nonhydrostatic. The nonhydrostatic option is experimental and untested, but the hydrostatic approximation may be invalid for resolutions finer than about 10-12 km. [1]

• dt: model time step (s). See the resolution table in the “build” section for recommended values depending on resolution. [60]

• nphs: call model physics every nphs time steps. nphs must be an even integer; the product of dt and nphs should be as close to 600 as possible. See the resolution table in the “build” section for recommended values depending on resolution. [10]

A quick consistency check for dt and nphs is sketched below.
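A small shell sanity check for these two variables, using the constraints stated above plus the divide-into-3600 rule from the model-instability section later:

    dt=60; nphs=10
    [ $(( nphs % 2 )) -eq 0 ]  || echo "nphs must be even"
    [ $(( 3600 % dt )) -eq 0 ] || echo "dt must divide evenly into 3600"
    echo "dt * nphs = $(( dt * nphs ))    # aim for a product near 600"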

Using the “runeta” script: Data directories

• ddir: full path name of directory where DWS post-processed output will be stored. It is a good idea to include the initialization date and hour in this directory name, as has already been done in the script.

• sdir: full path name of directory containing initial surface boundary data. For real-time forecasts, data is obtained from NCEP and saved here. For historical runs, sdir must contain appropriate data files. (I maintain a continuing archive of this data at COLA.) It is a good idea to include the initialization date and hour in this directory name, as has already been done in the script.

• gdir: not relevant for most DWS applications

Using the “runeta” script: Namelists

• There are two Fortran namelists in which runtime variables are specified, including those variables discussed previously as well as several additional variables discussed below

• The ETAIN namelist: This namelist is used by the pre-processor. Most variables here have already been given values in the script or GUI and are automatically assigned the same values in the namelist. Of the remaining variables, none are likely to require changing, but the user is referred to Matt Pyle’s documentation (pp. 6-7) for further information.

Using the “runeta” script: Namelists (continued)

• The FCSTDATA namelist: This namelist is used by the forecast model and post-processor. As with the ETAIN namelist, many variables here are automatically assigned the same values that have been specified or calculated in the script. Of the remaining variables, the following are the most important for the user:

• SPL: beginning on line 167 of the “runeta” script, specify the pressure levels on which post-processed model output may be requested. The number of entries in the SPL array must be equal to the value of LSM set in the “build” script (see the miscellaneous parameters above). The values are in Pa (not hPa!) and must be given in order of increasing pressure. Note that post-processed output will not necessarily be written to all of these pressure levels; the user may request output on a subset of the levels in the SPL array (this will be discussed later in the section on using the “cntrl.parm” file to control post-processor output).

Using the “runeta” script: Namelists (continued)

• TPREC, THEAT, TCLOD, TRDSW, TRDLW, TSRFC: these variables specify the number of hours over which various fluxes are accumulated or averaged before the totals or averages are reset to zero. They should be set considering the frequency at which model output is generated (i.e. the value of outfrq, described above). For example, resetting precipitation totals every 2 hours would not make sense if model output is requested every 3 hours. These variables may be given different values, but normally they are all equal. It may be advisable to set them to be the same as outfrq, as illustrated below.

• TPREC: precipitation

• THEAT: average latent heating associated with precipitation

• TCLOD: average cloud fractions

• TRDSW: shortwave radiation fluxes

• TRDLW: longwave radiation fluxes

• TSRFC: surface fluxes (e.g. average sensible heat flux)
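For example, with outfrq=6 the advice above amounts to entries along these lines in FCSTDATA (a hypothetical illustration; check the exact form used in the “runeta” script):

    TPREC=6., THEAT=6., TCLOD=6., TRDSW=6., TRDLW=6., TSRFC=6.,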

Using the “runeta” script: Running a forecast

• Execute the “runeta” script by typing “runeta”. If gui=yes, the GUI will guide the user through the process of selecting the runtime variables.

• After the GUI is finished, or if it is not used, the script will fetch input data, and then the pre-processor, model, and post-processor will run. Model output data and print output will be placed in $ddir (see the data directories above) after the run has finished. Execution time will vary according to domain size and resolution, forecast length, and network bandwidth for transfer of input data from the COLA GDS. For the domain and resolution settings that I have provided in the scripts, a 72-h forecast requires about 30 minutes to run, excluding the time required for input data acquisition, which depends on network conditions and may vary significantly from one run to another.

• To visualize model output, use the DWS GUI (Part III, described above) or any other utility that can read and display GRIB data

Using the “runeta” script: Automating DWS forecasts in batch jobs

• Using the UNIX crontab facility, it is possible to automatically run DWS forecasts at specified times

• When run this way, the “runeta” script is executed as a batch job; therefore, the DWS GUI must be disabled (set gui=no)

• crontab uses a file containing one line for each scheduled job

• Each line in the crontab file has six fields separated by spaces. The fields specify when the job is to be run and the script or program that runs the job.

Using the “runeta” script: Automating DWS forecasts in batch jobs (continued)

• The format of each line is as follows:

  mm hh dd MM DD command

• mm: minute (0 to 59)

• hh: hour (0 to 23), expressed in local time (i.e. the system time on the machine that is running the job)

• dd: day of month (1 to 28-31, depending on month)

• MM: month of year (1 to 12)

• DD: day of week (0 to 6, for Sunday to Saturday)

• command: full path name of the shell script or executable program that runs the job

Using the “runeta” script: Automating DWS forecasts in batch jobs (continued)

• The first five fields (mm, hh, dd, MM, DD) can be specified in any one of the following ways:

• an integer (to specify a single value)

• two integers separated by a hyphen (to specify an inclusive range)

• a list of integers separated by commas (to specify a set of single values)

• an asterisk (to specify all possible values)

• The specification of date and time using these five fields is not unique (there is redundancy between dd and DD). Be careful not to specify a set of fields that is inconsistent or invalid (e.g. specifying dd as 1-31 and MM as 1-12 or * would cause problems in months that do not have 31 days; it is better to specify dd as *).

• To submit the crontab file, enter “crontab crontab-file”

• To remove the crontab entry, enter “crontab -r”. The jobs specified in the crontab file will no longer be scheduled.

• See the crontab man page for more detailed information on using crontab

Using the “runeta” script: Automating DWS forecasts in batch jobs (continued)

• As an example of how to run DWS in batch mode using crontab, I have provided a sample crontab file, /home/ela/wseta/cronjobs, which contains two lines:

  0 23 * 1-12 0-6 /home/ela/worketa_all/runeta00
  0 11 * 1-12 0-6 /home/ela/worketa_all/runeta12

• The first line schedules a job to run at 2300 local time every day, in all months, using the script “runeta00” in /home/ela/worketa_all. This script runs a real-time DWS forecast initialized at 00 UTC.

• The second line schedules a job to run at 1100 local time every day, in all months, using the script “runeta12” in /home/ela/worketa_all. This script runs a real-time DWS forecast initialized at 12 UTC.

• To submit these two jobs to crontab, enter “crontab cronjobs”

• Using this method, DWS forecasts will automatically run twice daily

Data availability on the COLA GDS and how it affects real-time DWS forecasts

• Data from the 00 UTC GFS run is usually available on the COLA GDS by 0500 UTC. Occasionally, it is not available until later.

• Data from the 12 UTC GFS run is usually available on the COLA GDS by 1700 UTC. Occasionally, it is not available until later.

• The 5-hr delay is a result of the time required for GFS to run at NCEP and also the time required to transfer the data from NCEP to COLA. It is unlikely that we can reduce this delay much further.

• Before running a real-time DWS forecast, whether interactively or in batch mode, note the current time and make sure you are not trying to run the forecast too early. For example, if it is 1500 UTC and you start a real-time DWS forecast initialized at 12 UTC, it will fail because the required GFS data will not be available on the COLA GDS yet. If you want to run a real-time DWS forecast now, initialize it for 00 UTC instead.

• There will also be problems if you try to run a real-time DWS forecast when the desired initialization date is different from the current date (expressed in terms of UTC). For example, if it is currently 0000 UTC (or later) on 15 March, you will not be able to run a real-time DWS forecast initialized at 12 UTC (or any other time) on 14 March. In such a case, the forecast must be run as a historical simulation (see the “runeta” runtime variables and data directories sections above).

Model instability and how to correct for it

• Very occasionally, a DWS run will fail because of instability in the model. Such instability generally has two possible causes:

  corrupted input data
  input data is OK but the meteorological situation over the domain is unusual or extreme (e.g. a very strong jet stream or intense convection)

• If the input data is bad, it is usually impossible to run the model beyond the time at which the bad data begins to influence the model processes. However, if the input data is OK, a failed run can often be successfully re-run after shortening the time step.

Model instability and how to correct for it (continued)

• The resolution table in the “build” section shows recommended values for the model time step (dt) depending on resolution. In most cases, these values are sufficiently small to ensure stability.

• In case of instability, finding a value of dt which allows a successful run is a process of trial and error

• If dt is changed, the new value must divide evenly into 3600

• The value of nphs must also be changed so that the product of dt and nphs is as close to 600 as possible (see the nphs rules in the “runeta” section)

• After changing dt and nphs, repeat the run

• Example: Assume the resolution is 22 km, dt=60 and nphs=10. Suppose a run fails. One might reduce dt to 50, set nphs=12 and repeat the run. If the run still fails, dt can be reduced further, but the execution time will increase significantly and the additional time and effort may not be justified. There is also the possibility that the input data is corrupted, in which case a successful run is unlikely no matter how small the value of dt is.

Using the “cntrl.parm” file to control post-processor output

• The “cntrl.parm” file is located in worketa_all/eta/runs

• It consists of a header followed by many pairs of lines, with each pair corresponding to a single output field

• The first line in each pair specifies the name of the field and several parameters which are not of major concern to the user

• The second line in each pair looks similar to this:

  L=(01010 10101 01010 01010 10000 00000 00000 00000 00000 00000 …

• Each 0 or 1 corresponds to a pressure level in the array SPL that is specified in the FCSTDATA namelist (see the “runeta” namelists section above). The leftmost element corresponds to the first entry in SPL, i.e. the pressure level that is highest above the surface, and successive elements correspond to subsequent entries in SPL in the same order. Since there are exactly LSM pressure levels in SPL, only the first LSM elements in the string are relevant; the rest should be set to 0.

• For a given field, post-processed output is written on each pressure level for which the corresponding string element is set to 1; if the element is 0, no data is written for that pressure level

• For surface fields, the first string element determines if the field is written

Using the “cntrl.parm” file to control post-processor output (continued)

• Suppose that LSM=21 and SPL is specified as follows:

  SPL=5000.,10000.,15000.,20000.,25000.,30000.,35000.,40000.,45000.,50000.,55000.,60000.,65000.,70000.,75000.,80000.,85000.,90000.,92500.,95000.,100000.

• If the control string for a given field in the “cntrl.parm” file is:

  L=(01010 10101 01010 01010 10000 00000 00000 00000 00000 …

• the field will be output only on the following pressure levels (in hPa): 100, 200, 300, 400, 500, 600, 700, 850, 925, 1000

• Do not delete any lines in “cntrl.parm” or modify any lines except those beginning with “L=”. To remove a field from the output, set all elements in its control string to 0.

• More detailed information about the “cntrl.parm” file can be found in Matt Pyle’s documentation (pp. 11-12)

The interface between GrADS and DWS output data

• If the user has changed his/her choice of output fields and/or output pressure levels, changes are also necessary in the “runeta” script and the file worketa_all/eta/util/vars

• Line 201 in the “runeta” script tells GrADS how many pressure levels are in the DWS output and what the pressure values are

• The format of this line is as follows:

  zdef NLEVS levels LEV1 LEV2 LEV3 …

• Important: the levels are specified in order of decreasing pressure (opposite to the order used in specifying pressure levels in the SPL array)

• For the example shown in the previous slide, the line would be:

  zdef 10 levels 1000 925 850 700 600 500 400 300 200 100

• The file worketa_all/eta/util/vars tells GrADS how many fields are in a dataset and what those fields are

• Correctly modifying the “vars” file requires considerable expertise in working with GRIB data in GrADS and is beyond the scope of this presentation. If a user wishes to change his/her choice of output fields and/or output pressure levels, I can easily create a customized “vars” file.