
MCP: Pitfalls & Common Mistakes

Dr Jeremy Bass & RES Wind Analysis Teams (UK & USA)SENIOR TECHNICAL MANAGER

AWEA Wind Resource & Project Assessment Workshop, 30 September – 1 October 2009, Minneapolis, MN, USA

OVERVIEW – What Do You Need for MCP?

1. TARGET SITE DATA: PITFALLS & COMMON PROBLEMS - HARDWARE

Need to avoid instrumentation issues, including:

• poor mast installation

• poor mounting of instruments (IEC; stub mounting)

• poor choice of instruments (anemometers, vanes etc)

• poor choice of data logger and/or configuration

• insufficient power!

• poor/lack of maintenance & record keeping

1. TARGET SITE DATA: PITFALLS & COMMON PROBLEMS – QUALITY CONTROL - 1

[Figure: Bin Averaged Ice-Free Wind Speed Ratio (Unheated/Heated) on SWEhrnM178 – wind speed ratio (unheated/heated) vs wind direction bin (°N)]
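As an illustration of the QC check shown in the figure (comparing an unheated with a heated anemometer to flag icing), a minimal sketch in Python; the file and column names, the 0.9 ratio threshold and the 2 °C temperature screen are assumptions for illustration only:

```python
import pandas as pd

# Hypothetical 10-minute mean wind speeds from a heated and an unheated
# anemometer at the same height, plus air temperature, on a common index
data = pd.read_csv("mast_data.csv", index_col=0, parse_dates=True)
ratio = data["ws_unheated"] / data["ws_heated"]

# Icing slows or stops the unheated sensor, so the ratio drops well below
# its ice-free value of ~1; only screen records near or below freezing
suspect_icing = (ratio < 0.9) & (data["temperature_c"] < 2.0)
print(f"{suspect_icing.mean() * 100:.1f} % of records flagged as possible icing")
```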

1. TARGET SITE DATA: PITFALLS & COMMON PROBLEMS – QUALITY CONTROL - 2

1. TARGET SITE DATA: PITFALLS & COMMON PROBLEMS – UNDERSTANDING (HARD)

2. REFERENCE SITE DATA: PITFALLS & COMMON PROBLEMS – BACKGROUND

• the fundamental principle of MCP is that site climatology, over a 20 – 25 year life, is stationary, i.e. its statistics are consistent over time

• reference site data must be consistent with this requirement – essential!

2. REFERENCE SITE DATA: PITFALLS & COMMON PROBLEMS – FIXED MASTS

Failure to:

• examine site photos/visit site

• inspect site records

• choose site which reflects ‘regional’ winds

• choose site with similar climatology to target site

• choose site with good long-term mean

• choose site with long enough concurrent period available?

• choose site with long enough historic period available?

Last requirement can create problems…

2. REFERENCE SITE DATA: PITFALLS & COMMON PROBLEMS – FUTURE PROBLEM?

However:

• In US, ASOS masts recently re-equipped with sonic anemometers

• In UK, UKMO has installed consistent instrumentation at all stations

The problem:

• instrument changes may destroy continuity

• can result in limited number of reference sites being suitable

• very sparse networks of low quality meteorological stations in many areas

Outcome: often few or no suitable reference sites available!

2. REFERENCE SITE DATA: PITFALLS & COMMON PROBLEMS – MESO-SCALE DATA

As an alternative, ‘virtual’ mast data may be appropriate:

• NCEP/NCAR Re-Analysis 2 data (2.5 deg resolution; 6 hour time base)

• meso-scale data

• WorldWind Atlas

• others…

Don’t forget that:

• must fulfill same requirements as fixed mast data!

• use with caution!

• last resort option!!

3. MCP – PRE-PROCESSING 1

Get to know your data:

• create time series plots of target and reference site data

• are time series in-phase?

• do time series display the same gross trends?

In practice the process of identifying a good reference site is iterative!
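As an illustration of this “get to know your data” step, a minimal Python sketch; the file names and column names are hypothetical, and hourly mean wind speeds are assumed:

```python
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical hourly mean wind speed series with datetime indices
target = pd.read_csv("target_site.csv", index_col=0, parse_dates=True)["ws"]
reference = pd.read_csv("reference_site.csv", index_col=0, parse_dates=True)["ws"]

# Keep only the concurrent period and average to monthly means, which makes
# phase and gross trends much easier to judge by eye
both = pd.concat({"target": target, "reference": reference}, axis=1).dropna()
monthly = both.resample("MS").mean()

ax = monthly.plot(figsize=(10, 4))
ax.set_ylabel("Mean wind speed (m/s)")
ax.set_title("Are the series in-phase? Do they show the same gross trends?")
plt.tight_layout()
plt.show()
```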

3. MCP – PRE-PROCESSING 2

• Create scatter plots of target and reference site data

– may give insight into choice of suitable MCP algorithm BUT...

– scatter plots often misleading and need a 3rd dimension (example)

• generally need to ensure that correlation coefficient, r, is ≥ 0.7
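Continuing the previous sketch, a minimal example of the scatter plot and the correlation check; `ref_dir` (a reference wind direction series used as the third dimension) is a hypothetical additional input:

```python
import pandas as pd
import matplotlib.pyplot as plt

# Concurrent, matched samples only (target/reference as in the previous sketch)
both = pd.concat({"target": target, "reference": reference}, axis=1).dropna()

r = both["target"].corr(both["reference"])   # Pearson correlation coefficient
print(f"Pearson r = {r:.2f} ({'acceptable' if r >= 0.7 else 'probably too low for MCP'})")

# A plain scatter can hide direction- or stability-dependent behaviour,
# so colour the points by a third variable, here the reference wind direction
sc = plt.scatter(both["reference"], both["target"],
                 c=ref_dir.reindex(both.index), cmap="twilight", s=4)
plt.colorbar(sc, label="Reference wind direction (°N)")
plt.xlabel("Reference wind speed (m/s)")
plt.ylabel("Target wind speed (m/s)")
plt.show()
```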

3. MCP – CHOICE OF ALGORITHM - 1

Assuming we have in-phase, suitably pre-averaged QC’d data, what MCP algorithm should we choose?

• several classes of algorithm:

– linear models (y = mx + c)

– non-linear models

– JPD-type models

– neural network models

• within each class, several choices available

• all have strengths and weaknesses!

Choice might depend on what you want to use MCP results for!

From Paul van Lieshout’s ‘Wind resource analysis based on the properties of wind or “SKM Weibull’s correlation methodology evaluated”’ paper at All-Energy 2009 conference
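As a concrete illustration of the simplest class, a linear model y = mx + c fitted over the concurrent period and then applied to the historic reference record; this is a generic sketch (numpy.polyfit, single all-direction fit), not any particular vendor’s implementation, and `reference_long_term` is a hypothetical long historic series:

```python
import numpy as np

# Matched concurrent-period samples (from the 'both' DataFrame above)
ref_concurrent = both["reference"].to_numpy()
tgt_concurrent = both["target"].to_numpy()

# Fit y = m x + c by ordinary least squares over the concurrent period
m, c = np.polyfit(ref_concurrent, tgt_concurrent, deg=1)

# Apply the relationship to the long historic reference record to obtain a
# historic estimate (HE) of target-site wind speed
ref_historic = reference_long_term.to_numpy()
he = m * ref_historic + c
print(f"m = {m:.3f}, c = {c:.3f}, long-term mean HE = {he.mean():.2f} m/s")
```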

3. MCP – CHOICE OF ALGORITHM – 2

• JPD methods, e.g. RES matrix method
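The RES matrix method itself is not reproduced here; purely as a generic illustration of the JPD idea (build the joint distribution of concurrent reference and target speeds, then predict the target speed from the conditional distribution for each reference-speed bin), a sketch with assumed 1 m/s bins, continuing from the arrays above:

```python
import numpy as np

bins = np.arange(0.0, 36.0, 1.0)               # assumed 1 m/s speed bins
centres = 0.5 * (bins[:-1] + bins[1:])

# Joint counts of (reference bin, target bin) over the concurrent period
joint, _, _ = np.histogram2d(ref_concurrent, tgt_concurrent, bins=[bins, bins])

# Conditional mean target speed for each reference-speed bin (NaN if empty;
# empty bins would need filling, e.g. from the linear fit, in practice)
with np.errstate(invalid="ignore"):
    cond_mean = (joint * centres).sum(axis=1) / joint.sum(axis=1)

# Predict a target speed for every sample of the historic reference record
idx = np.clip(np.digitize(ref_historic, bins) - 1, 0, len(centres) - 1)
he_jpd = cond_mean[idx]
```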

3. MCP – DATA SUB-CATEGORISATION

• typically data decomposed into sub-categories prior to applying MCP

• typical sub-category is wind direction

• 12 (or 16) directional sectors common

• not always good choice - inspection of the wind rose may inform this

• inspection of diurnal trend may help inform this choice (hourly)

– if pronounced trend, consider a number of ‘time of day’ sectors

– trying to capture periods with similar atmospheric stability

– see Andy Oliver & Kris Zarling’s paper at AWEA 2009!

Regardless of sub-categorisation, need enough data to populate all sectors (see the sector-binning sketch below)
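A minimal sketch of direction sub-categorisation (12 sectors of 30° with a separate linear fit per sector); the sector count, the per-sector linear fit and `ref_dir_concurrent` (an array of concurrent reference wind directions in °N) are illustrative assumptions:

```python
import numpy as np

n_sectors = 12                              # 16 sectors is also common
width = 360.0 / n_sectors

# Assign each concurrent sample to a direction sector
sector = np.floor((ref_dir_concurrent % 360.0) / width).astype(int)

fits = {}
for s in range(n_sectors):
    mask = sector == s
    if mask.sum() < 100:                    # crude 'is this sector populated?' check
        print(f"sector {s}: only {mask.sum()} samples - consider a longer period")
        continue
    fits[s] = np.polyfit(ref_concurrent[mask], tgt_concurrent[mask], deg=1)
```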


3. MCP – PREDICTION APPROACH & UNCERTAINTY ANALYSIS

[Schematic: target and reference site records against time, showing the Site Measurements (AAE), Historic Estimate (HE), Long Term Estimate (LTE), Historic Reference Measurements and the Concurrent Period Relationship (MCP), with the uncertainty expressions for the HE and the AAE (MCP and instrument uncertainty terms combined in quadrature)]
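The exact expressions from the slide are not reproduced here; as a generic illustration of combining independent percentage uncertainties in quadrature (the component values below are invented for illustration):

```python
import math

# Hypothetical 1-sigma percentage uncertainties on the historic estimate
u_instrument = 2.0   # anemometer calibration / mounting
u_mcp        = 3.0   # scatter about the concurrent-period relationship
u_long_term  = 2.5   # representativeness of the historic reference period

u_total = math.sqrt(u_instrument**2 + u_mcp**2 + u_long_term**2)
print(f"combined wind speed uncertainty ≈ {u_total:.1f} %")
```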

3. MCP – SECOND STEP PREDICTIONS

3. MCP – SECOND STEP PREDICTION VS GAP FILLING

3. MCP – CONCURRENT PERIOD & SEASONALITY

For most sites, the pattern of normal seasonal variation means that the precise choice of concurrent period will affect the prediction

•to avoid use only full integer periods of data

•not always practical!

•if have less than a year of data, try to avoid extremes (e.g. summer/winter)

•can generally identify from data whether significant

•Probably more significant for sites with thermally, rather than pressure driven, winds?

46

Comparison of Normalised York HE for All Sites Investigated

0.85

0.90

0.95

1.00

1.05

1.10

0 8760 17520 26280 35040

Number of hours

No

rmal

ised

HE

Win

d S

pee

d

Sunflow er Electric

High Plains

Sleeping Bear

Somerset

Hopkins

High variability initially, starting to converge after 24 months

Comparison of Normalised Matrix HE for All Sites Investigated

0.85

0.90

0.95

1.00

1.05

1.10

0 8760 17520 26280 35040

Number of hours

No

rmalised

HE

Win

d S

peed

Sunflow er Electric

High Plains

Sleeping Bear

Somerset

Hopkins

In first year, long-term predictions can be in error by ± 5 - 10 %!

3. MCP – CONCURRENT PERIOD & SEASONALITY

For most sites, the pattern of normal seasonal variation means that the precise choice of concurrent period will affect the prediction

•to avoid use only full integer periods of data

•not always practical!

•if have less than a year of data, try to avoid extremes (e.g. summer/winter)

•can generally identify from data whether significant

•Probably more significant for sites with thermally, rather than pressure driven, winds?

47

Comparison of Normalised York HE for All Sites Investigated

0.85

0.90

0.95

1.00

1.05

1.10

0 8760 17520 26280 35040

Number of hours

No

rmal

ised

HE

Win

d S

pee

d

Sunflow er Electric

High Plains

Sleeping Bear

Somerset

Hopkins

High variability initially, starting to converge after 24 months

Comparison of Normalised Matrix HE for All Sites Investigated

0.85

0.90

0.95

1.00

1.05

1.10

0 8760 17520 26280 35040

Number of hours

No

rmalised

HE

Win

d S

peed

Sunflow er Electric

High Plains

Sleeping Bear

Somerset

Hopkins

In first year, long-term predictions can be in error by ± 5 - 10 %!

Seasonally-corrected estimate is more stable and shows less spatial variability

Method shows promise!
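To check the sensitivity on your own data, one simple approach is to redo the prediction with progressively longer concurrent periods and watch the long-term estimate converge, much like the plots above; a minimal sketch reusing the linear-fit arrays (assumed to be in chronological order):

```python
import numpy as np
import matplotlib.pyplot as plt

step = 24 * 30                                   # roughly monthly increments of hourly data
lengths = np.arange(step, len(ref_concurrent) + 1, step)

he_means = []
for n in lengths:
    m, c = np.polyfit(ref_concurrent[:n], tgt_concurrent[:n], deg=1)
    he_means.append((m * ref_historic + c).mean())

he_means = np.asarray(he_means)
plt.plot(lengths, he_means / he_means[-1])       # normalise by the final estimate
plt.xlabel("Hours of concurrent data used")
plt.ylabel("Normalised long-term estimate")
plt.show()
```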

4. MCP – IDEAL PREDICTION STRATEGY

This might feature:

• a rigorous appreciation of errors/uncertainty (crucial!)

• the use of ‘portfolio’ MCP predictions

• predictions based on multiple reference sites, real and virtual

• consideration of whether predictions are consistent with expectations

• ‘Round Table’ discussions amongst colleagues

• some iteration is inevitable!
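One simple way to combine a ‘portfolio’ of predictions from several reference sites is an inverse-variance weighted mean; this is a generic sketch, not the RES procedure, and the per-site estimates and uncertainties below are invented:

```python
import numpy as np

# Hypothetical long-term mean wind speed estimates (m/s) and 1-sigma
# uncertainties (m/s) from three different reference sites
estimates     = np.array([7.10, 7.35, 7.22])
uncertainties = np.array([0.25, 0.40, 0.30])

weights = 1.0 / uncertainties**2
combined = np.sum(weights * estimates) / np.sum(weights)
combined_unc = 1.0 / np.sqrt(np.sum(weights))    # assumes independent estimates
print(f"portfolio estimate = {combined:.2f} ± {combined_unc:.2f} m/s")
```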

CONCLUSIONS

• if approached with diligence and care, MCP can be a vital tool

• it requires attention to detail at every stage of the site assessment process, not just in MCP model building (tiny part overall!)

• you need to understand how to obtain:

– high quality (target) site data

– high quality, appropriately chosen, reference site data

• you need to understand the application and limitations of MCP software

• you need the skills, knowledge & experience to use it & interpret results

• experiments suggest that MCP success has far more to do with the choice of a high quality reference site than with the MCP algorithm!
