Validation and Comparison between WAsP and Meteodyn Predictions for a Project in Complex Terrain
Meteodyn WT users meeting – Paris (France), March 21 and 22, 2011
Gilles Boesch, Wind Project Analyst
Salim Chemanedji, Senior Project Manager
Martin Hamel, Project Manager
Hatch (Montreal), Canada


Page 1: Hatch Ltd. Validation and comparison WAsP and meteodyn 2011

Validation and Comparison between WAsP and Meteodyn Predictions for a Project in Complex Terrain

Meteodyn WT users meeting – Paris (France), March 21 and 22, 2011

Gilles Boesch, Wind Project Analyst
Salim Chemanedji, Senior Project Manager
Martin Hamel, Project Manager
Hatch (Montreal), Canada

Page 2: Hatch Ltd. Validation and comparison WAsP and meteodyn 2011

Hatch
• Employee-owned, projects in more than 150 countries
  – 8,000 employees worldwide
  – EPCM, integrated teams, project and construction management
  – Consulting – process, technologies and business
  – Serving mining & metals, infrastructure and energy
• For Wind Power projects:
  – Wind resource assessment
  – Geotechnical engineering, foundation design
  – Turbine evaluation and selection
  – Total project and construction management
  – Interconnection assessment, electrical engineering
  – Environmental assessment


Page 3: Hatch Ltd. Validation and comparison WAsP and meteodyn 2011

Overview

• Review of models
• Presentation of a test case
• Results and comparisons
• Conclusions and investigations


Page 4: Hatch Ltd. Validation and comparison WAsP and meteodyn 2011

Review of models


Page 5: Hatch Ltd. Validation and comparison WAsP and meteodyn 2011

Why is CFD a good alternative to linear models?
• CFD is now well recognized by the wind community
• It overcomes the limitations of linear models in complex terrain
• It reduces the modelling uncertainty
• It reduces financial risks

But CFD must be used with care, since it is more complex.


Page 6: Hatch Ltd. Validation and comparison WAsP and meteodyn 2011

Why is CFD a good alternative to linear models?
• Some questions remain:
  – Can we quantify the uncertainty and errors associated with these models?
  – What are the criteria for choosing between linear and CFD models?
  – Do CFD models always perform better than linear models?

This is usually difficult to assess, because only a few meteorological masts are available within a project to perform cross-predictions.


Page 7: Hatch Ltd. Validation and comparison WAsP and meteodyn 2011

Why is CFD a good alternative to linear models?

CFD Models (Meteodyn)
• Pros
  – Suitable for complex terrain
  – Calibration of the site possible (forest, stability, mesh, etc.)
  – Built-in features (energy, extreme winds, turbulence, etc.)
• Cons
  – Solid expertise needed
  – Calibration is difficult to perform (when possible)
  – Calculation time

Linear Models (WAsP)
• Pros
  – Easy and fast computation
  – Good performance in relatively flat terrain
  – Is already a standard
• Cons
  – High errors in complex terrain


Page 8: Hatch Ltd. Validation and comparison WAsP and meteodyn 2011

Presentation of a test case


Page 9: Hatch Ltd. Validation and comparison WAsP and meteodyn 2011

A test case
• Comparison between WAsP and Meteodyn on a potential project
• Project covers an area of 11 km x 8 km
• Equipped with 12 meteorological masts (recording from 6 months to 6 years of data)
• Relatively complex terrain (deep valleys, ridges, rolling mountains)

• Mix of coastal and inland areas


Page 10: Hatch Ltd. Validation and comparison WAsP and meteodyn 2011

A test case
• Forest cover varies among:
  – Completely logged areas (no trees)
  – 15 m high trees
  – Regrowth
• RIX variations (Ruggedness Index)
  – % of slopes > 30% within a 3500 m radius (see the sketch below)
  – 2 to 25% over the entire project
  – 2.7 to 22.4% at the meteorological masts

This gives a variety of conditions to evaluate the behavior of the models.
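For illustration, a RIX-style ruggedness value can be estimated from a terrain slope raster by counting the share of cells steeper than the 30% threshold within the 3500 m search radius. This is only a minimal sketch under that assumption: the grid spacing and the disc-based counting (rather than WAsP's radial transects) are simplifications, and the `slope_pct` input is hypothetical.

```python
import numpy as np

def rix(slope_pct, i, j, cell_size=30.0, radius=3500.0, threshold=30.0):
    """Fraction (%) of terrain cells steeper than `threshold` (% slope)
    within `radius` metres of grid point (i, j).

    slope_pct : 2-D array of terrain slope in percent (hypothetical input).
    cell_size : grid spacing in metres (assumed here: 30 m).
    """
    ny, nx = slope_pct.shape
    yy, xx = np.mgrid[0:ny, 0:nx]
    dist = np.hypot((yy - i) * cell_size, (xx - j) * cell_size)
    disc = dist <= radius                      # cells inside the search radius
    steep = slope_pct[disc] > threshold        # cells steeper than the threshold
    return 100.0 * steep.mean()

# Example with synthetic terrain slopes (percent):
slopes = np.abs(np.random.default_rng(0).normal(15, 12, size=(400, 400)))
print(f"RIX at grid centre: {rix(slopes, 200, 200):.1f} %")
```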


Page 11: Hatch Ltd. Validation and comparison WAsP and meteodyn 2011

A test case

Masts   Altitude (m)   RIX (%)
M1      540            10.1
M2      560            11.0
M3      421            22.4
M4      420            17.9
M5      448            15.1
M6      521            16.6
M7      560            8.0
M8      433            22.1
M9      440            11.8
M10     665            14.3
M11     567            2.7
M12     540            12.1


Page 12: Hatch Ltd. Validation and comparison WAsP and meteodyn 2011

A test case


Page 13: Hatch Ltd. Validation and comparison WAsP and meteodyn 2011

A test case


Page 14: Hatch Ltd. Validation and comparison WAsP and meteodyn 2011

Meteodyn settings
• Topographical information:
  – Roughness: 0.6 for trees
  – Elevation contours: 5 m within the project area
• Mesh:
  – Mapping area covering all met masts
  – Mesh independence tests (variation of the radius)
  – Minimum horizontal resolution: 30 m
  – Minimum vertical resolution: 5 m
  – 3,460,000 cells in the prevailing direction


Page 15: Hatch Ltd. Validation and comparison WAsP and meteodyn 2011

Meteodyn settings
• Model:
  – Robust forest model (convergence issues with the dissipative model)
  – Near-neutral stability class
  – 30-degree directional steps
• Data:
  – Measured data
  – Quality controlled
  – At 50 m or 60 m height
  – Extrapolated to the long term with a standard MCP method (sketch below)
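The slides only state that a "standard MCP method" was used, without detailing it. As a hedged illustration, the sketch below shows the simplest linear-regression form of Measure-Correlate-Predict; the variable names and synthetic data are assumptions, and real MCP work is typically done per direction sector with more robust fits.

```python
import numpy as np

def mcp_linear(site_speed, ref_speed, ref_longterm):
    """Very simple MCP: fit site = a * reference + b on the concurrent
    period, then apply the fit to the long-term reference series.

    site_speed, ref_speed : concurrent wind speeds (m/s) at the site mast
                            and at the long-term reference station.
    ref_longterm          : full long-term reference series (m/s).
    """
    a, b = np.polyfit(ref_speed, site_speed, 1)   # least-squares slope and offset
    return a * ref_longterm + b                   # predicted long-term site series

# Synthetic usage example:
rng = np.random.default_rng(1)
ref = rng.weibull(2.0, 8760) * 8.0                      # one year of hourly reference data
site = 0.9 * ref + 0.5 + rng.normal(0, 0.5, ref.size)   # concurrent site data
ref_lt = rng.weibull(2.0, 8760 * 10) * 8.0              # ten-year reference record
site_lt = mcp_linear(site, ref, ref_lt)
print(f"Long-term mean at site: {site_lt.mean():.2f} m/s")
```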


Page 16: Hatch Ltd. Validation and comparison WAsP and meteodyn 2011

Results and comparisons


Page 17: Hatch Ltd. Validation and comparison WAsP and meteodyn 2011

Results – Wind Speeds
• Cross-Prediction Matrix
  – Predictors: synthesis performed with the « Predictor » mast
  – Predicted: wind speed at the « Predicted » mast

The matrix has one row per predictor mast (M1 … M12) and one column per predicted mast: diagonal cells hold the measured wind speed at each mast (e.g. "M1 measured"), while off-diagonal cells hold the cross-predictions (e.g. "M1 predicts M2", "M2 predicts M1").


Page 18: Hatch Ltd. Validation and comparison WAsP and meteodyn 2011

Results - Errors
• Cross-Prediction Matrix
  – 12 x 12 matrix; excluding the 12 diagonal (measured) cells leaves 132 cross-predictions
  – For both WAsP and Meteodyn
  – No correction is applied to either model's output
  – A correction is often applied with WAsP because of wind speed inconsistencies in complex terrain
• Converted into a Relative Error Matrix (see the sketch below):

  E% = (V_predicted − V_measured) / V_measured

• Resulting in 132 relative error values, one for each cross-prediction
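A minimal sketch of assembling such a relative-error matrix is shown below; the synthetic `predicted` and `measured` arrays are assumptions standing in for the model outputs and the mast measurements.

```python
import numpy as np

# Hypothetical inputs: predicted[i, j] is the mean wind speed at mast j when the
# model is initialised with mast i; measured[j] is the measured mean at mast j.
n_masts = 12
rng = np.random.default_rng(2)
measured = rng.uniform(6.0, 8.5, n_masts)
predicted = measured + rng.normal(0.0, 0.4, (n_masts, n_masts))

# Relative error E% = (V_predicted - V_measured) / V_measured
rel_error = (predicted - measured) / measured

# The diagonal is each mast predicting itself, so exclude it:
off_diag = ~np.eye(n_masts, dtype=bool)
errors = rel_error[off_diag]                 # 12*12 - 12 = 132 values
print(f"{errors.size} cross-predictions, "
      f"mean |error| = {np.mean(np.abs(errors)):.1%}, "
      f"max |error| = {np.max(np.abs(errors)):.1%}")
```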


Page 19: Hatch Ltd. Validation and comparison WAsP and meteodyn 2011

Results - Errors
• Absolute errors:

               WAsP     Meteodyn
  Min Error     0.0%       0.0%
  Max Error    34.0%      14.1%
  Average       7.1%       4.7%

• On average, Meteodyn reduces the error by about 35% (from an average of 7.1% with WAsP to 4.7% with Meteodyn).
• Some exceptions: 33 cases out of 132 show better results with WAsP.


Page 20: Hatch Ltd. Validation and comparison WAsP and meteodyn 2011

Results - Errors
• Generally, the WAsP and Meteodyn errors have the same sign (positive/negative) for a given cross-prediction.

  [Chart: relative error (%) of each cross-prediction, WAsP vs Meteodyn series]

• The difference is in the magnitude of the error.


Page 21: Hatch Ltd. Validation and comparison WAsP and meteodyn 2011

Results - Errors
• Comparison at each mast (mast altitudes and RIX values as in the table on Page 11)

  [Chart "Error comparison": average error (%) at each met mast M1–M12, WAsP vs Meteodyn]


Page 22: Hatch Ltd. Validation and comparison WAsP and meteodyn 2011

Results - Errors
• RIX dependency:
  – WAsP: the error increases sharply when RIX > 15%
  – Meteodyn: the error is more constant

  [Chart "RIX influence on cross-prediction errors": average error (%) vs RIX (%), WAsP vs Meteodyn]


Page 23: Hatch Ltd. Validation and comparison WAsP and meteodyn 2011

Results - Errors
• RIX dependency:
  – Risø suggests correcting WAsP with ∆RIX (the difference in RIX between the two masts)
  – The correction is based on a correlation between the error and ∆RIX for each cross-prediction (sketch after the plots)
  – Open question: can we correct Meteodyn based on the RIX?

  [Scatter plots "Error vs dRIX": error (%) vs ∆RIX (%) for each cross-prediction.
   Meteodyn: y = 0.5552x, R² = 0.6345. WAsP: y = 1.0632x, R² = 0.7025.]
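As a hedged illustration of this kind of correction, the sketch below fits a zero-intercept slope of relative error against ∆RIX (matching the "y = ax" fits on the plots) and removes that trend from a prediction. The function names and synthetic data are assumptions, and the actual Risø procedure may differ in detail.

```python
import numpy as np

def fit_drix_slope(errors, drix):
    """Zero-intercept least-squares slope of relative error vs dRIX,
    i.e. the 'y = a*x' fit shown on the scatter plots (both as fractions)."""
    drix = np.asarray(drix, dtype=float)
    errors = np.asarray(errors, dtype=float)
    return np.sum(drix * errors) / np.sum(drix ** 2)

def correct_prediction(v_predicted, drix, slope):
    """Remove the dRIX-correlated part of the error from a prediction.
    Since E = (V_pred - V_meas)/V_meas ~ slope * dRIX, the corrected
    estimate is V_pred / (1 + slope * dRIX)."""
    return v_predicted / (1.0 + slope * drix)

# Hypothetical usage with synthetic cross-prediction data:
rng = np.random.default_rng(3)
drix = rng.uniform(-0.2, 0.2, 132)                    # dRIX as a fraction (-20%..+20%)
errors = 1.06 * drix + rng.normal(0, 0.03, 132)       # WAsP-like behaviour
slope = fit_drix_slope(errors, drix)
print(f"Fitted slope: {slope:.2f}")                   # close to the plotted 1.06

v_pred = 7.0                                          # predicted speed at a target mast (m/s)
print(f"Corrected speed for dRIX = +10%: {correct_prediction(v_pred, 0.10, slope):.2f} m/s")
```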

Page 24: Hatch Ltd. Validation and comparison WAsP and meteodyn 2011

Results - Errors
• RIX dependency:
  – The error increases when ∆RIX increases
  – Error and ∆RIX appear to be correlated
  – The slope is lower for Meteodyn

Meteodyn is therefore less sensitive to differences in site topography.

  [Same "Error vs dRIX" scatter plots as on the previous slide: Meteodyn y = 0.5552x, R² = 0.6345; WAsP y = 1.0632x, R² = 0.7025.]

Page 25: Hatch Ltd. Validation and comparison WAsP and meteodyn 2011

Results - Uncertainty
• 11 estimates of wind speed for each mast (one from each of the other 11 masts)
• Uncertainty is estimated with the standard deviation of the errors (sketch after the table)

  Masts   Uncertainty WAsP   Uncertainty Meteodyn   Uncertainty Reduction   RIX (%)
  M1      4.6%               2.4%                   1.9                     10.1
  M2      4.4%               3.1%                   1.4                     11.0
  M3      7.8%               3.1%                   2.5                     22.4
  M4      4.2%               2.5%                   1.7                     17.9
  M5      2.9%               2.7%                   1.1                     15.1
  M6      2.8%               2.7%                   1.0                     16.6
  M7      4.7%               3.5%                   1.4                     8.0
  M8      5.7%               3.4%                   1.7                     22.1
  M9      3.0%               2.3%                   1.3                     11.8
  M10     4.2%               2.8%                   1.5                     14.3
  M11     4.8%               3.1%                   1.5                     2.7
  M12     3.4%               2.5%                   1.4                     12.1
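A minimal sketch of that uncertainty estimate, assuming it is simply the standard deviation of the 11 cross-prediction relative errors obtained at each mast (the slides do not detail the exact statistic); the synthetic error matrix below is a placeholder for the real WAsP or Meteodyn results.

```python
import numpy as np

def mast_uncertainty(rel_error):
    """Per-mast uncertainty as the standard deviation of the cross-prediction
    relative errors in each column of the 12 x 12 relative-error matrix
    (diagonal, i.e. self-prediction, excluded)."""
    n = rel_error.shape[0]
    off_diag = ~np.eye(n, dtype=bool)
    return np.array([rel_error[off_diag[:, j], j].std(ddof=1) for j in range(n)])

# Hypothetical usage with a synthetic 12 x 12 relative-error matrix:
rng = np.random.default_rng(4)
rel_error = rng.normal(0.0, 0.04, (12, 12))
for name, u in zip([f"M{i + 1}" for i in range(12)], mast_uncertainty(rel_error)):
    print(f"{name}: {u:.1%}")
```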


Page 26: Hatch Ltd. Validation and comparison WAsP and meteodyn 2011

Results - Uncertainty
• Characterizes the repeatability of an estimate
• Uncertainty is reduced on average by a factor of 1.5 when using Meteodyn
• No trend with regard to the RIX
• The separation distance between masts matters more for the uncertainty

These numbers are site-specific and must be considered with care!


Page 27: Hatch Ltd. Validation and comparison WAsP and meteodyn 2011

Conclusions and investigations


Page 28: Hatch Ltd. Validation and comparison WAsP and meteodyn 2011

Conclusions
• For this project, Meteodyn shows better results for error and uncertainty compared to WAsP
• Significant advantages:
  – Cost reduction: fewer meteorological masts needed per project
  – Financial risk reduction: lower uncertainty increases the P75 or P99 value


Page 29: Hatch Ltd. Validation and comparison WAsP and meteodyn 2011

Conclusions
• However:
  – WAsP results are without any correction, which is often applied in practice (such as the RIX correction)
  – Results are specific to this site
  – Some cases are better predicted with WAsP
  – Other projects with lower RIX show equivalent results between both models


Page 30: Hatch Ltd. Validation and comparison WAsP and meteodyn 2011

Conclusions
• Further investigations and questions:
  – How do the models compare when WAsP is corrected with the RIX?
  – Can we correct Meteodyn's results in some way (RIX or other)?
  – Why does WAsP predict the wind speed better at some points?
    • Mesh refinement?
    • Forest model?
    • Roughness?
  – What are the criteria for defining a terrain in terms of complexity (use of WAsP vs Meteodyn)?
  – How many met towers should we use in complex terrain?
