
Air Conditioning Load Management Impact and Process Report

-UTILITY- Residential Core Plus Programs EMV Report PY1_2011_12_12_FINAL.docx Page 75

-The following section is an excerpt from the final report-


6. AIR CONDITIONING LOAD MANAGEMENT IMPACT AND PROCESS REPORT

6.1 PROGRAM DESCRIPTION

The objective of the Air Conditioner Cycling Program, or Air Conditioning Load Management (ACLM) program, is to reduce peak load by curtailing air conditioning during peak usage periods (summer months, June through September). This program focuses on residential customers; <UTILITY> also manages a parallel program for commercial customers.

Peak load reduction is enabled through a load control receiver (LCR) installed outside the home near the central air conditioning unit. The LCR receives signals from <UTILITY> via a paging network to initiate and terminate operation. When active, the LCR cycles the air conditioner on and off over set time periods, or duty cycles. The receiver is currently set for a 30% duty cycle, meaning that 30% of the participant household's load in a given hour is curtailed during load management events. Unlike conventional switches, which curtail load by switching the AC unit off for 15 minutes and on for 15 minutes during each 30-minute period, the switch vendor (<TECHNOLOGY VENDOR>) uses its TrueCycle technology. <TECHNOLOGY VENDOR> claims that through this technology, the device learns the behavior of the air conditioning load over time in order to calculate a formula that achieves more precise load reduction on oversized AC units.

According to <TECHNOLOGY VENDOR>, the device adjusts the cycle rate based on the historical hourly runtime profile for that home. The device saves historical load shapes for a given home for similar hot summer days, which are then used to develop a historical hourly load shape; more recent shapes are given greater weight than older ones. In other words, the device estimates the load being consumed by the participating AC unit, adjusting for units that are not typically turned on or are not operating at full capacity. It then curtails a pre-determined share of that estimated load (30% in 2011, though this can vary as needed) by turning switches on and off across the participant homes.
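The vendor's exact weighting scheme is proprietary and not described in the report; the sketch below illustrates one plausible approach, recency weighting via geometric decay, which is an assumption rather than the vendor's actual method:

```python
def weighted_load_shape(historical_shapes, decay=0.5):
    """Blend hourly load shapes from similar hot days into one
    reference shape, weighting recent days more heavily.

    historical_shapes: list of 24-value hourly kW profiles, ordered
    oldest to newest. decay: weight multiplier per step back in time
    (0 < decay <= 1). Both the structure and the decay value are
    illustrative assumptions, not vendor parameters.
    """
    n = len(historical_shapes)
    # Newest shape gets weight 1; older shapes get decay, decay^2, ...
    weights = [decay ** (n - 1 - i) for i in range(n)]
    total = sum(weights)
    return [
        sum(w * shape[h] for w, shape in zip(weights, historical_shapes)) / total
        for h in range(24)
    ]

# Example: two prior hot days, the newer one running higher.
older = [1.0] * 24
newer = [2.0] * 24
blended = weighted_load_shape([older, newer], decay=0.5)
# The newer day carries 2/3 of the weight in each hour.
```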

The program goal is to install 50,000 load control relays on air conditioners to enable cycling during peak demand periods. The program has been in effect since 2002 and has installed units in more than 36,000 premises. Recruiting and installation are ongoing until program goals are reached.

The <UTILITY> program is mostly used as an emergency program, so events are not necessarily called during every summer season. No events were called between 2008 and 2010. In 2011, three load events were called. These three events were called to reduce peak load; an additional two events were called for a sample population to test load outcomes under different conditions (temperature and duty cycle percentages).

Customers receive a $5 per month/per unit electricity bill credit from June to September, or $20.00 per program season. The program is open to commercial customers, though residential customers comprise the overwhelming majority of those who have enrolled. Those who rent their property are eligible to enroll, contingent on written permission from the landlord. This evaluation covers the program period from January through December 2011.


6.2 PROGRAM BUDGETS, GOALS, AND SCORECARDS

The <UTILITY> budget for the ACLM program for 2011 is $1,569,000. As of December 2011, the program had spent 84% of its budget, or $1,317,441. The budget includes incentive payments to customers as well as installation and operational expenses. <IMPLEMENTER>, the third-party vendor, prices all costs on a per-unit basis and bills <UTILITY> as a lump sum once the units have been installed, sending monthly invoices to <UTILITY>.

<IMPLEMENTER> is paid for each of the following:

• Per device installed
• Per service visit
• A designated marketing budget

<UTILITY> has an approximate customer base of 430,000, and has judged that the saturation point for the ACLM program would be approximately 50,000 participants. Since its inception in 2002, the enrollment goal has been 5,000 customers per year. <IMPLEMENTER> expects an initial dropout rate of 15%-20%, comprising those who enroll in the program but then prove to be either ineligible or uninterested at the time of installation. This – combined with an approximate 1% dropout rate among participants who had already had the device installed – means that <IMPLEMENTER> attempts to enroll between 5,500 and 6,000 participants to meet the goal of 5,000 participants per year.

The program had installed a total of 35,955 [18] devices from program start (2002) through 2011, and enrolled approximately 3,599 participants in 2011 alone, or 72% of its annual enrollment target. As the program nears its saturation point, recruiting can be expected to become more challenging.

<UTILITY> called three load management events in 2011.

Table 85 provides program goals and reported results from the program.

Table 85: 2011 Reported Program Goals and Results

Metric               Goal         Actual       % of Goal
Budget               $1,569,000   $1,317,441   84%
Enrollment Targets   5,000        3,599        72%
kWh                  54,395       39,585       73%
kW                   5,000        3,599        72%

Source: <UTILITY> 2011 Scorecard

The goals and scorecards are based on the number of customers who enrolled in the program in 2011, and do not include the entire participant population since program inception. See comments on the scorecard under the Process Insights & Recommendations section of this chapter.

[18] The current active population is lower, given dropouts over the program period.


6.3 EM&V METHODOLOGY

The table below highlights EM&V activities associated with the ACLM program.

Table 86: Program Evaluation Overall Tasks

Program Manager and Implementer Interviews: Interviewed the Program Manager; interviewed the Implementer (<IMPLEMENTER>); interviewed the Device Manufacturer (<TECHNOLOGY VENDOR>)

Program Database Review / Verification: Reviewed treatment group tracking information

Program Material Review: Reviewed materials to assess marketing and outreach efforts; reviewed materials associated with training

Impact Analysis: Collected hourly load data for the treatment group for 2011 from <UTILITY>; collected hourly temperature and humidity data for the <UTILITY> service territory (<STATE> Airport station) for summer 2011; developed an average load shape for the participant sample (i.e., referential load); assessed the difference between referential load and actual usage to determine impacts

6.4 IMPACT ANALYSIS

The following section summarizes the methodology and results from our impact analysis for the three events called in 2011. <UTILITY> provided the evaluation team with hourly metered data [19] for the cycling period (June 1, 2011–September 30, 2011) for a sample of residential ACLM customers. The sample included 67 participating customers. <UTILITY> also provided hourly temperature and humidity data for 2011 from the Indianapolis Airport Station, Indiana. [20]

Given the nature of the load control receiver, <UTILITY> cannot calculate the load shed by the participant population for the devices already installed, nor verify on an ongoing basis whether the devices are still installed (in case HVAC units are serviced and/or replaced) or operational. As a result, impact estimates of load-shed events are based on a statistical evaluation of a sample of premises that have smart meters installed. These smart meters collect hourly interval data for the entire premise; thus, all loads in a premise (lighting, refrigeration, and other end uses) are included, not simply HVAC.

The impact analysis is based on a treatment group of 67 program participants. Because the load control receiver can only receive signals, not transmit data, a statistically representative sample of metered premises must be used to obtain data for the impact analysis.

[19] The evaluation team received hourly data from <UTILITY>. The team plans to discuss the availability of 15-minute data with <UTILITY>.
[20] Note that for two days, September 29 and 30, no weather or humidity data was provided. These days were excluded from our analysis.


The sample selected for analysis is, according to <UTILITY>, a stratified random sample that is meant to be representative of all ACLM participants. The original sample, according to <UTILITY> documents, included 80 CoolCents participants randomly selected (4 from each of 20 billing districts). For the impact analysis, the sample received had been reduced to 67 participants. The treatment group focuses on residential customers only, whereas the ACLM program also has small commercial participants in its database (<UTILITY> manages a distinct ACLM program for large C&I customers). <UTILITY> is also considering recruiting multifamily premises to meet its recruiting targets, which would in general yield less load per premise than an average single-family home. Finally, there is generally a high correlation between average energy consumption of a premise and the observed load impact for a given premise; thus, treatment group customers should be representative across energy usage strata represented in the population.

The evaluation team has requested the full participant database with the above-mentioned variables to conduct an analysis of whether the current treatment group is statistically representative of the entire population. At the time of this report, this data was not available to the evaluation team. As such, the team could not verify whether the selected sample can be reasonably expected to represent the entire population.

Given that smart meters are allocated only to the sample customers, it was not possible to obtain a different sample for the 2011 analysis. As such, despite some concerns about the statistical validity of the treatment sample, these impacts were extrapolated to the participant population. However, should impact analysis be conducted for the 2012 program period, the evaluation team recommends that an equivalency check between sampled participants and the participant population be conducted, focusing on representation across geographic areas (or billing districts) as well as stratification along demographics and energy consumption. Should <UTILITY> also target markedly different segments, such as multifamily, which are expected to yield lower capacity per home, then the sample size may need to be increased and a sample stratification scheme devised to reflect differences in the participant population.

Methodology

The evaluation team developed a statistical model to estimate a reference (or baseline) hourly kW load during event periods. The statistical model predicts what hourly kW usage would have been on the event day, particularly during the event, if no event had been called.

To construct hourly reference kW usage for each event day, the evaluation team conducted separate regressions for each participant in the sample. This approach was used because households tend to use energy differently, and will therefore respond differently to weather conditions and to events. Individual regressions provide the most accurate hourly prediction because they estimate both individual intercepts as well as individual slopes. [21] Once individual regressions were conducted, the evaluation team pooled the results over the sample to estimate mean usage for each hour of the day.
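The per-household-regression-then-pool step can be sketched as follows. This uses synthetic data and a simplified single-regressor (intercept plus THI) specification, not the evaluation's actual model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic setting: 10 weekdays x 24 hours of THI observations.
hours = np.tile(np.arange(24), 10)
thi = 70 + 15 * np.sin((hours - 6) / 24 * 2 * np.pi) + rng.normal(0, 1, hours.size)

def fit_household(y, thi):
    """OLS of hourly kW on an intercept and THI for ONE household,
    so each household gets its own intercept and its own slope."""
    X = np.column_stack([np.ones_like(thi), thi])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta  # [intercept, THI slope]

# Simulate three households with different true parameters, fit each.
betas = []
for intercept, slope in [(0.2, 0.02), (0.5, 0.04), (0.1, 0.01)]:
    y = intercept + slope * thi + rng.normal(0, 0.05, thi.size)
    betas.append(fit_household(y, thi))

# Pool: predict each household's load for one day's 24 hours, then
# average across households to get the mean hourly reference shape.
day_thi = thi[:24]
preds = np.array([b[0] + b[1] * day_thi for b in betas])
reference_shape = preds.mean(axis=0)  # 24 pooled hourly means
```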

Once a predicted reference load for each event was developed, the evaluation team then determined demand reduction during each event by subtracting predicted usage from actual usage

[21] Note that a fixed-effects panel model provides individual intercepts, but slopes are estimated over the sample.


during the event for each hour. The percent reduction was calculated by dividing this difference by the predicted usage:

%Reduction_h = (Actual_h − Predicted_h) / Predicted_h × 100
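That hourly computation can be sketched directly. The sign convention follows the text's definition (difference = actual minus predicted, so a negative value indicates shed load):

```python
def hourly_impact(actual_kw, predicted_kw):
    """Demand reduction and percent reduction for one hour, per the
    report's definition: difference = actual - predicted, and the
    percent reduction is that difference divided by predicted usage."""
    diff = actual_kw - predicted_kw
    return diff, 100.0 * diff / predicted_kw

# Example (illustrative values): predicted 2.0 kW, actual 1.5 kW
# during an event hour.
reduction, pct = hourly_impact(1.5, 2.0)
# reduction = -0.5 kW; pct = -25.0 (negative means load was shed)
```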

Observations Included in Analysis

All 2011 events were called during one week in July (Monday, July 18; Thursday, July 21; and Friday, July 22). For the regression models, the evaluation team excluded weekends and holidays because usage behavior is typically different on weekends and holidays than on weekdays, and events are not called during these periods. [22]

The regression equation relies on non-event, non-holiday weekday data to estimate load on an AC Cycling event day through observations of consumption under similar conditions (such as temperature and humidity). The observation days included in the analysis vary across the events, totaling 17 days (excluding the three event days). Observation days vary given the differences in mean temperature by month during the cycling period (see Figure 11). Each month has a distinct weather profile, with July 2011 being the hottest month of the summer period.

Figure 11: Hourly Average Temperature by Month

Model fit improves when observation days used in the regression are close to the event-day values (i.e., temperature, humidity, etc.). The observations for July were similar to the Event 1 profile. A different set of days was required for estimating the models for Events 2 and 3 because those events exhibited higher-than-average temperatures. As such, regression estimates based on higher

[22] Events on two other days, 8/23/2011 and 8/24/2011, were excluded, as these were test events and are not within the scope of this evaluation.

[Figure 11 plot: average temperature (degrees Fahrenheit) by hour of day (0–23) for June, July, August, and September; June 1–September 28, 2011, excluding weekends and holidays; Indianapolis Airport Weather Station]


temperature observations provide a better fit for the estimated load for the participating population. Table 87 provides daily mean and mean peak-period (noon to 7 p.m.) temperatures for the event days, as well as for each month.

Table 87: Temperature (Degrees Fahrenheit) by Date

Date                       Daily Mean Temperature   Mean Temperature, Hours 12–19

Events
Event 1: July 18, 2011     84                       91
Event 2: July 21, 2011     89                       98
Event 3: July 22, 2011     88                       95

Months
June                       74                       81
July                       82                       90
August                     77                       87
September                  64                       72

Price (2010) points out that the effects of weather differ at different points on the temperature scale. [23] In other words, the slope representing the change in usage differs across temperature ranges. Events 2 and 3 took place on days with very high temperatures, and the evaluation team also found that the load curves were different on these days. Therefore, the analysis sample frame included days where the maximum Temperature Humidity Index (THI) [24] was greater than 80. This value was chosen because it provided the highest-temperature days available while still providing an appropriate number of observations for the analysis. [25]
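The day-selection step can be sketched using the PJM THI formula cited in footnote 24. The date labels and observation values below are illustrative, not data from the evaluation:

```python
def thi(temp_f, rh_pct):
    """Temperature Humidity Index per the PJM Manual 18B formula
    cited in the report: THI = T - 0.55 * (1 - RH/100) * (T - 58)."""
    return temp_f - 0.55 * (1.0 - rh_pct / 100.0) * (temp_f - 58.0)

def hot_days(daily_hourly_obs, threshold=80.0):
    """Keep days whose maximum hourly THI exceeds the threshold.
    daily_hourly_obs maps a date label to a list of (temp_f, rh_pct)
    hourly observations."""
    return [
        day for day, obs in daily_hourly_obs.items()
        if max(thi(t, rh) for t, rh in obs) > threshold
    ]

# Two synthetic days: one hot and humid, one mild.
sample = {
    "2011-07-21": [(95.0, 55.0), (99.0, 50.0)],
    "2011-09-15": [(70.0, 40.0), (74.0, 35.0)],
}
selected = hot_days(sample)  # only the hot day passes the THI > 80 screen
```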

Model Selection

Many models were tested, and the final models were chosen based on fit with actual usage, especially during the hours leading up to the event. Fit was judged on R2, Adjusted R2, [26] and Akaike's Information Criterion (AIC). [27] The models suppress the constant term to obtain a more

[23] Price, P., "Methods for Analyzing Electric Load Shape and its Variability," Ernesto Orlando Lawrence Berkeley National Laboratory, Environmental Energy Technologies Division, May 2010. http://drrc.lbl.gov/sites/drrc.lbl.gov/files/LBNL-3713E_0.pdf

[24] The Temperature Humidity Index is calculated as follows: THI = TEMPERATURE_F − 0.55 × (1 − RELATIVE_HUMIDITY_PCT/100) × (TEMPERATURE_F − 58.0). PJM Manual 18B: Energy Efficiency Measurement & Verification, Revision 01, effective March 1, 2010, p. 39. https://www.pjm.com/~/media/documents/manuals/m18b.ashx
[25] Note that to identify a large enough sample of days meeting the criterion of maximum THI greater than 80, the evaluation team expanded the selection to include both July and August 2011.

[26] R2, or the coefficient of determination, is typically used as a goodness-of-fit measure for regression models. R2 is the ratio of the explained variation to the total variation. Adjusted R2 imposes a penalty for adding additional independent variables to a model. The value of R2 and Adjusted R2 is always between zero and one, where one is a perfect fit of the data. For more information, see Wooldridge, J.M., "Introductory Econometrics: A Modern Approach," 3rd Edition, 2006.
[27] AIC, or Akaike's Information Criterion, balances finding a parsimonious model with finding the most predictive model. The information criterion is defined as AIC = −2 lnL + 2k, where lnL is the maximized log-likelihood of the model and k is the number of parameters estimated. Akaike, H. (1973), "Information theory and an extension of the maximum likelihood principle," in B.N. Petrov and F. Csáki, eds., Proc. 2nd International Symposium on Information Theory, Akadémiai Kiadó, Budapest, 267-281.


precise fit and to accommodate errors in prediction, which are almost entirely associated with usage in the hour that would be omitted with a constant term. The models are specified to replicate actual usage during non-event hours, especially the hours before the event, so that there is a high level of confidence in the reference points during event hours. The evaluation team ran model diagnostics and determined that heteroskedasticity and autocorrelation were present in the model. Therefore, the evaluation team computed heteroskedasticity- and autocorrelation-consistent (HAC) standard errors for each model. [28]
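The HAC correction described in footnote 28 can be sketched in numpy. This is a minimal Newey-West implementation with the Bartlett kernel and a maximum lag of 25, simplified relative to what statistical packages provide; the simulated data is illustrative:

```python
import numpy as np

def newey_west_se(X, resid, max_lag=25):
    """Newey-West (HAC) standard errors for OLS coefficients:
    robust to heteroskedasticity and to autocorrelation up to
    max_lag periods. X: (n, k) regressor matrix; resid: (n,)
    OLS residuals."""
    n, k = X.shape
    xe = X * resid[:, None]                 # rows x_t * e_t
    S = xe.T @ xe                           # lag-0 (White) term
    for lag in range(1, max_lag + 1):
        w = 1.0 - lag / (max_lag + 1.0)     # Bartlett kernel weight
        gamma = xe[lag:].T @ xe[:-lag]      # lag-l cross-product
        S += w * (gamma + gamma.T)
    XtX_inv = np.linalg.inv(X.T @ X)
    cov = XtX_inv @ S @ XtX_inv             # sandwich estimator
    return np.sqrt(np.diag(cov))

# Example: regression with AR(1) errors, where plain OLS standard
# errors would be unreliable.
rng = np.random.default_rng(1)
n = 500
x = rng.normal(size=n)
e = np.zeros(n)
for t in range(1, n):
    e[t] = 0.7 * e[t - 1] + rng.normal()
X = np.column_stack([np.ones(n), x])
y = 1.0 + 2.0 * x + e
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
se = newey_west_se(X, y - X @ beta, max_lag=25)
```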

Two models were selected to predict referential load during event days, to address specific event-day characteristics. Both models incorporate weather variables, with weather as the major predictor of energy consumption. The models use two weighted THI variables: the weighted average THI over the three hours prior to each hour, and the weighted average THI over the 24 hours prior to each hour. These weighted THI variables account for the build-up of temperature and humidity in the home over time: an 85°F hour is experienced differently when the prior day was also 85°F than when it was 70°F. The models also include the hour of the day, as time of day is highly predictive of usage.
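Constructing the lagged weighted THI variables can be sketched as follows. The report does not publish its weighting scheme, so the geometric decay here is an illustrative assumption:

```python
def weighted_lag(series, h, window, decay=0.8):
    """Weighted average of the `window` values of `series` before
    hour index h, weighting recent hours more heavily. The decay
    weighting is an assumption; the report's weights are not given."""
    lags = [series[h - k] for k in range(1, window + 1)]        # h-1, h-2, ...
    weights = [decay ** (k - 1) for k in range(1, window + 1)]  # newest first
    return sum(w * v for w, v in zip(weights, lags)) / sum(weights)

# Synthetic hourly THI series, rising steadily over two days.
thi_series = [70.0 + 0.5 * i for i in range(48)]

# THIlag3 and THIlag24 for hour index 30:
thilag3 = weighted_lag(thi_series, 30, window=3)
thilag24 = weighted_lag(thi_series, 30, window=24)
# THIlag3 tracks the last few hours; THIlag24 reaches back a full
# day, capturing the heat build-up effect described above.
```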

The first model, Equation 3, predicts referential load for Events 1 and 2. The regression model is described as:

Equation 3: Regression Model for Event 1 and Event 2

kW_h = β1·Hour_h + β2·(Hour_h × THIlag24_h) + β3·(Hour_h × THIlag3_h) + β4·(Hour_h × THIlag3_h × EventHour_h) + ε_h

where:

kW_h: Predicted hourly energy consumption during hour h
β: Coefficient or vector of coefficients
Hour_h: Variable indicating hours of the day, estimating effects of usage behavior with weather and usage
THIlag24_h: Weighted average of the Temperature Humidity Index for the previous 24 hours from hour h
THIlag3_h: Weighted average of the Temperature Humidity Index for the previous three hours from hour h
EventHour_h: Indicator variable modeling hourly effects of events when an event is called in hour h [a]
ε_h: Error term for hour h

[a] EventHour_h (the participation variable) took the value of 1 during the hours an event was called and 0 during other hours in the analysis.

[28] As part of our model diagnostics, the evaluation team computed heteroskedasticity and autocorrelation consistent (HAC) standard errors using Newey-West standard errors with a maximum lag of 25 (a lag length exceeding the periodicity of the data, in this case hours). Used in this context, the standard errors are robust to both arbitrary autocorrelation up to the chosen lag and arbitrary heteroskedasticity. For more information, see Wooldridge, J.M., Introductory Econometrics, 3rd ed., pp. 20-21.


For Event 3, the evaluation team incorporated additional variables into the model. This is necessary because the event was called on a Friday. As noted in Figure 12 below, the average load shape for Fridays tends to be markedly different from those of the other weekdays, with a more pronounced decrease in consumption in the late afternoon/early evening hours. This figure provides an overview of load consumption by day for weekdays across the cycling period (June–September 2011).

Figure 12: Load Consumption in Residences by Day (Weekdays Only)

To capture this Friday pattern, the model incorporated a day-of-week dummy variable, where Friday can equal either a 1 or 0 based upon the day in question, and is interacted with post-event hours. The evaluation team incorporated this second variable because many of the Event 3 models tested were substantially under-predicting the post-event hours.
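Constructing that dummy-and-interaction term can be sketched as follows. The event-end hour passed in is an illustrative simplification of how "post-event hours" would be flagged:

```python
from datetime import date

def friday_post_interaction(day, hour, event_end_hour):
    """Interaction term in the spirit of Equation 4's
    PostHour x Friday variable: 1 only in post-event hours on a
    Friday, 0 otherwise. event_end_hour marks when the event ended
    (an illustrative simplification)."""
    friday = 1 if day.weekday() == 4 else 0       # Mon=0 ... Fri=4
    post_hour = 1 if hour > event_end_hour else 0
    return friday * post_hour

# Event 3 was called on Friday, July 22, 2011; suppose it ended in
# hour 15. The interaction is 1 only after that, and only on Friday.
fri_post = friday_post_interaction(date(2011, 7, 22), 17, 15)   # 1
fri_during = friday_post_interaction(date(2011, 7, 22), 14, 15) # 0
thu_post = friday_post_interaction(date(2011, 7, 21), 17, 15)   # 0 (Thursday)
```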

Equation 4 provides the estimating equation used for Event 3.

Equation 4: Regression Model for Event 3

kW_h = β1·Hour_h + β2·(Hour_h × THIlag24_h) + β3·(Hour_h × THIlag3_h) + β4·(Hour_h × THIlag3_h × EventHour_h) + β5·(PostHour_h × Friday_h) + ε_h

The model is identical to Equation 3 except for the following additional variable:

PostHour_h × Friday_h: Post-event hours interacted with a day-of-week indicator, where Friday_h equals 1 on Fridays and 0 otherwise, for each day and hour h

Model Validation



The evaluation team reviewed a variety of metrics to determine whether the models predicted actual savings well, assessing the goodness-of-fit metrics R2 and Adjusted R2 (see the footnotes under Model Selection for a review of these metrics).

Table 88: Goodness-of-Fit Metrics for Each Event

Pooled Average    R2       Adjusted R2
Event 1           0.894    0.869
Event 2           0.897    0.874
Event 3           0.900    0.875

The typical threshold for goodness of fit for customer-specific regression models is an adjusted R2 greater than 0.70. All of our models exceed this threshold. Our models were primarily chosen by best AIC statistic. Parlin (2006) [29] suggests the rule of thumb that a difference in AIC of 10 or more between two competing models indicates a preference for the model with the lower AIC. The evaluation team therefore selected the best-predicting model from among those with AIC scores within 10 points of the lowest.
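That selection rule can be sketched as follows, using the AIC definition from footnote 27. The prediction score is an illustrative stand-in for the team's comparison of predicted versus actual usage:

```python
def aic(log_likelihood, k):
    """Akaike's Information Criterion, per footnote 27:
    AIC = -2 lnL + 2k."""
    return -2.0 * log_likelihood + 2.0 * k

def select_model(candidates):
    """Among models whose AIC is within 10 points of the minimum
    (the rule of thumb cited from Parlin 2006), pick the one with
    the best prediction score. candidates: list of
    (name, aic_value, prediction_score), higher score = better fit
    to actual usage (the scoring field is an illustrative proxy)."""
    best_aic = min(a for _, a, _ in candidates)
    in_band = [c for c in candidates if c[1] <= best_aic + 10]
    return max(in_band, key=lambda c: c[2])[0]

models = [
    ("A", 1000.0, 0.85),  # lowest AIC
    ("B", 1006.0, 0.90),  # within 10 AIC points, predicts better
    ("C", 1020.0, 0.95),  # best predictor, but outside the AIC band
]
chosen = select_model(models)  # model B wins under the rule
```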

In addition to reviewing goodness of fit metrics, the evaluation team also reviewed how well the model predicted load during non-event hours. Table 89 provides the average pre-period and post-period difference in predicted versus actual load during the event periods.

Table 89: Difference in Predicted Versus Actual Load During Event Days

Pooled     Pre-Period Difference    Post-Period Difference    All Non-Event Period Difference
Event 1    -0.31%                   5.94%                     1.88%
Event 2    2.36%                    1.48%                     2.03%
Event 3    -7.35%                   2.07%                     5.24%

Note: If the value is negative, the regression under-predicted usage by the percentage shown.

As part of the efforts to assess the validity of the models, the evaluation team reviewed how well the models predict load reduction during event hours. Figure 13 through Figure 15 show the predicted versus actual values during each of the three events called in 2011. For Event 1, actual load is lower than the predicted load by about 1% across all non-event hours. For Events 2 and 3, there is virtually no difference between the actual and predicted load.

[29] Parlin, K., "Model Selection for the Impact Evaluation of Energy Efficiency Programs," 2006.


Figure 13: Event 1 Actual versus Predicted Load for Event Hours (17 Days)

Figure 14: Event 2 Actual versus Predicted Load for Event Hours (17 Days)

[Figure 13 and Figure 14 plots: average kW per premise by hour of day (0–23), actual versus predicted load during event hours]


Figure 15: Event 3 Actual versus Predicted Load for Event Hours (17 Days)

The evaluation team also conducted a second validation exercise that is based on the idea that a model estimated on one sample should work well on other samples. When another sample is not available, it is common to test the model on a subsample within the original one. This tests the possibility that the model is so sample-specific that it does not work even within subsets of the original modeled sample. We found no systematic bias in the modeling (see Appendix A for more details).

Sample Savings by Event

The program savings assumptions are based upon an expected load reduction of 1 kW per home per hour. These assumptions originate from the load control receiver vendors as well as <UTILITY>'s assessment of an AC Cycling event called in 2006, which yielded an average of 1.05 kW per home on a 50% duty cycle. There are multiple variables that influence observed load impact (as seen in the regression equations above and discussions throughout this document), such as temperature and humidity, the participant profile (size of HVAC unit, unit usage patterns, housing stock, etc.), as well as duty cycle, duration of the event, and time of day when the event is called.

Three events were called in 2011 for the Residential ACLM program. Table 90 provides the date and time of the event and the analysis time frame for each event. Because the data received was at an hourly level, the hour served as our unit of analysis.

Table 90: ACLM Event Dates and Times

Event      Date        Time of Event     Analysis Time     Max Temp    Duty Cycle
Event 1    7/18/2011   13:31 to 17:00    13:00 to 17:00    92°F        30%
Event 2    7/21/2011   12:47 to 16:45    12:00 to 17:00    99°F        30%
Event 3    7/22/2011   12:08 to 15:30    12:00 to 16:00    96°F        30%

[Figure 15 plot: average kW per premise by hour of day (0–23), actual versus predicted load during event hours]


Figure 16 shows the actual event load during the period and the predicted load using Equation 3, as described above. There is an observed snapback following the event period, as is commonly seen when households experience HVAC curtailments. Presumably, the curtailment results in a rise in indoor temperature compared to what it would have been absent the curtailment, so the household increases usage to compensate. This snapback may stress the electric system, depending on when it occurs. In this case, it is possible that other load (e.g., commercial/industrial) has largely dissipated by that time, so the spike is not problematic from a system operations standpoint. If that is not the case, however, the program should consider the timing of event endings so that events do not essentially shift the peak rather than smooth it out.

Figure 16 also illustrates another salient point related to the load drop observed at the time the event is called. Once an event is initiated, the switches begin cycling at a randomized rate, so they do not all start at the same time. In practical terms, this means the initial load drop is a sloped reduction rather than an immediate drop for the entire participant population (if all switches started simultaneously, the drop would resemble a straight-line delta from predicted load at the event trigger, rather than the sloped line shown in the figure). Further research is required to determine the rate at which switches are activated during the randomization process, as this may affect the overall duty cycle percentage achieved during partial hours.

In addition to staggered start times, events were not called at the beginning of the hour (e.g., Event 1 commenced at 13:31 rather than 13:00), and smart meter data was provided at hourly rather than 15-minute intervals. These factors make it difficult to accurately estimate the load shed for the hour in which an event begins. For Event 1, for example, the first hourly interval reflects only 29 minutes of the event and therefore underestimates the load shed for that partial hour. This partial-hour issue does not directly affect kWh energy savings, which are registered as reductions relative to what the model predicts usage would have been for that hour. Partial hours do, however, affect the claimed average observed kW demand reduction per premise: including a partial hour, during which the program did not run for the full 60 minutes, would artificially reduce the average observed load reduction for the program period. Thus, while we report savings for each program hour (noting where it is a partial hour), we exclude partial hours from calculations of average demand savings over the program period. This applies to the reported values for Events 1, 2, and 3 in this report.


Figure 16: Average Actual and Predicted Load for Event 1

Table 91 shows the reductions achieved for each hour of Event 1.

Table 91: Event 1 Reductions and Recovery by Hour, July 18, 2011, Scheduled Between 13:31 and 17:00

Time Frame       Hour Beginning                     Avg. Hourly kW Reduction   % Savings
Event Hours      13:00*                             -0.11                      -3.87%
                 14:00                              -0.58                      20.30%
                 15:00                              -0.53                      17.53%
                 16:00                              -0.41                      13.26%
                 Average (a)                        -0.51                      17.03%
                 Standard Error of Difference (b)    0.09                       4.30%
Recovery Hours   17:00                               0.25                      -8%
                 18:00                               0.31                      -10%
                 19:00                               0.15                      -5%
                 20:00                               0.06                      -2%

Note: * denotes partial hour, which is excluded from the average kW reduction and % savings calculations.
a Average excludes partial hour 13:00.
b SE = (s/√n) · √(1 − n/N), where s = sample standard deviation of the difference, n = sample size, and N = population size.
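The partial-hour exclusion and the footnote's standard error can be sketched as follows. This is a minimal illustration: the hourly values come from Table 91, while the finite-population-corrected standard-error form and the sample standard deviation `s` passed to it are our assumptions, not the report's exact estimator.

```python
import math

# Hourly kW reductions for Event 1 (values from Table 91); the partial
# hour is excluded from the event-period average, per the table note.
hours = [
    ("13:00", -0.11, True),   # partial hour: event began at 13:31
    ("14:00", -0.58, False),
    ("15:00", -0.53, False),
    ("16:00", -0.41, False),
]

full = [kw for _, kw, partial in hours if not partial]
avg_kw = sum(full) / len(full)
print(round(avg_kw, 2))  # -0.51, matching the table's event-period average

def se_fpc(s: float, n: int, N: int) -> float:
    """Standard error of the mean with a finite population correction:
    SE = (s / sqrt(n)) * sqrt(1 - n/N)."""
    return (s / math.sqrt(n)) * math.sqrt(1 - n / N)
```

With n = 67 sample premises and N ≈ 34,249 participants, the correction factor is close to 1, so the sample size, not the population fraction, drives the reported uncertainty.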

[Figure 16 chart: average kW per premise by hour (0–23) on July 18, 2011, y-axis 0–4 kW; series: Event 1 Actuals and Event 1 Predictions.]


In summary, for Event 1, the total load shed was 1.62 kWh per premise over the event period. Applied to the estimated participant population, this yields gross energy savings of 55 MWh for the event. There was a snapback effect of 0.77 kWh per premise after the event period, so net energy savings for Event 1 are 29 MWh. These effects are summarized in Table 92.

Table 92: Gross Program Savings – Event 1, July 18, 2011

Event 1        kWh Load Shed per Participant   CI Lower (a)   CI Upper (a)   Participating Customers   Gross Energy Savings (kWh)
Event Hours     1.62 (b)                        0.972          2.27           34,249                     55,570
Snapback        0.77                           -0.464         -1.084                                    -26,513
Net Savings     0.85                            0.508          1.188                                     29,057

a Confidence intervals were measured at 90%, one-tail.
b Includes partial-hour kW for hour 13 (estimated at -0.11 kW).
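The scaling from per-premise values to program totals in Table 92 is straightforward multiplication, sketched below with the table's rounded per-premise figures; the small differences from the reported totals arise because the report scales unrounded values.

```python
# Scaling Event 1 per-premise values (Table 92) to the participant
# population. Figures are the table's rounded values, so totals differ
# slightly from the report's 55,570 / 29,057 kWh.
shed_per_premise = 1.62      # kWh shed per premise during event hours
snapback_per_premise = 0.77  # kWh rebound per premise after the event
participants = 34_249

gross_kwh = shed_per_premise * participants
net_kwh = (shed_per_premise - snapback_per_premise) * participants
print(round(gross_kwh / 1000))  # ~55 MWh gross
print(round(net_kwh / 1000))    # ~29 MWh net
```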

Figure 17 shows the actual and predicted load for Event 2, which was called on Thursday, July 21, 2011, from 12:47 p.m. through 4:45 p.m. Compared with typical events, this event was called earlier in the day and ended before reaching the typical peak load period. This may explain why the observed load drop is smaller than for Event 1, despite the event being called on a warmer day, with a maximum temperature of 99°F.

Figure 17: Average Actual and Predicted Load for Event 2

Table 93 shows reductions achieved for each hour for Event 2.

[Figure 17 chart: average kW per premise by hour (0–23) on July 21, 2011, y-axis 0–4.5 kW; series: Event 2 Actuals and Event 2 Predictions.]


Table 93: Event 2 Reductions and Recovery by Hour, July 21, 2011, Scheduled Between 12:47 and 16:45

Time Frame       Hour Beginning                     Avg. Hourly kW Reduction   % Savings
Event Hours      12:00*                              0.06                      -2%
                 13:00                              -0.33                      10%
                 14:00                              -0.44                      13%
                 15:00                              -0.29                       8%
                 16:00*                             -0.29                       8%
                 Average (a)                        -0.35                      11%
                 Standard Error of Difference (b)    0.12                       5.8%
Recovery Hours   17:00                               0.17                      -5%
                 18:00                               0.11                      -3%
                 19:00                              -0.02                       0%

Note: * denotes partial hour, which is excluded from the average kW reduction and % savings calculations.
a Average excludes partial hours 12:00 and 16:00.
b SE = (s/√n) · √(1 − n/N), where n = sample size and N = population size.

In summary, for Event 2, the total load shed was 1.29 kWh per premise over the event period. Multiplied by the participant population, this yields gross energy savings of 44 MWh for the event. There was a snapback effect of 0.26 kWh per premise after the event period, so net savings for Event 2 are 35 MWh.

Table 94: Gross Program Savings – Event 2, July 21, 2011

Event 2        kWh Load Shed per Participant   CI Lower (a)   CI Upper (a)   Participating Customers   Gross Energy Savings (kWh)
Event Hours     1.29 (b)                        0.33           2.25           34,270                     44,240
Snapback       -0.26                            0.07           0.46                                      -9,025
Net Savings     1.03                            0.40           2.71                                      35,215

a Confidence intervals were measured at 90%, one-tail.
b Includes partial-hour kW for hour 12 (estimated at 0.06 kW) and hour 16 (estimated at -0.29 kW).

Figure 18 shows the actual and predicted load on the day of Event 3, which was called on Friday, July 22, 2011, from 12:08 p.m. through 3:30 p.m. This event was called relatively early in the day and ended before reaching the typical peak load period. This may explain why the observed load drop is smaller than for Event 1, despite the event being called on a warmer day, with a maximum temperature of 96°F.

The predicted load after the event seen in Figure 18 is clearly an underestimate of the Friday pre-period load. This problem was addressed through many model re-specifications designed to make the prediction conform more closely to reality; however, every method that resolved this issue created problems in the post-event period. Subsequent model validations confirmed that this model, while apparently under-predicting event-period load, matches pre- and post-period load most closely.

Figure 18: Average Actual and Predicted Load for Event 3

Table 95 shows the reduction achieved for each hour of Event 3. Again, events were not called on the hour.

[Figure 18 chart: average kW per premise by hour (0–23) on July 22, 2011, y-axis 0–4 kW; series: Event 3 Actuals and Event 3 Predictions.]


Table 95: Event 3 Reductions and Recovery by Hour, July 22, 2011, Scheduled Between 12:08 and 15:30

Time Frame       Hour Beginning                     Avg. Hourly kW Reduction   % Savings
Event Hours      12:00*                             -0.30                      10%
                 13:00                              -0.32                      10%
                 14:00                              -0.49                      15%
                 15:00*                             -0.24                       7%
                 Average (a)                        -0.41                      12%
                 Standard Error of Difference (b)    0.14                       5%
Recovery Hours   16:00                               0.33                      -10%
                 17:00                               0.17                      -5%
                 18:00                               0.12                      -3%

Note: * denotes partial hour, which is excluded from the average kW reduction and % savings calculations.
a Average excludes partial hours 12:00 and 15:00.
b SE = (s/√n) · √(1 − n/N), where n = sample size and N = population size.

In summary, for Event 3, the total load shed was 1.35 kWh per premise over the event period. Multiplied by the participant population, this yields gross energy savings of 46 MWh for the event. There was a snapback effect of 0.62 kWh per premise after the event period, so net savings for Event 3 are 25 MWh.

Table 96: Gross Program Savings – Event 3, July 22, 2011

Event 3        kWh Load Shed per Premise   CI Lower (a)   CI Upper (a)   Participating Customers   Gross Energy Savings (kWh)
Event Hours     1.35 (b)                    0.50           2.19           34,271                     46,108
Snapback        0.62                        0.23           1.01                                      21,253
Net Savings     0.73                        0.27           1.18                                      24,855

a Confidence intervals were measured at 90%, one-tail.
b Includes partial-hour kW for hours 12 and 15 (estimated at -0.30 and -0.20 kW, respectively).

Application of Savings to Population

Table 97 and Table 98 apply average kW and total kWh savings by event to the program population. Total gross energy savings across the three events were 145 MWh. Including snapback effects, net savings for the program period total 89 MWh. Total demand savings vary by event day and overall participant population; as established, the program yielded between 12 MW and 17 MW of load shed on event days.


Table 97: kW Savings by Event

Event     Sample Avg. kW per Event Hour (A)   Population (B)   Population Savings (A×B)
Event 1   0.51                                34,249 (a)       17,325
Event 2   0.35                                34,270 (b)       12,079
Event 3   0.41                                34,271 (c)       13,894

a, b, c Number of participants with installations prior to 7/18/2011, 7/21/2011, and 7/22/2011, respectively, per the participant database received from <IMPLEMENTER> in June 2012.

Table 98: kWh Savings by Event

Event     Sample kWh per Event Period (A)   Population (B)   Population Savings (A×B)
Event 1   1.62                              34,249 (a)       55,570
Event 2   1.29                              34,270 (b)       44,240
Event 3   1.35                              34,271 (c)       46,108

Event kWh Savings    145,918
Snapback              56,790
Net Savings           89,127

a, b, c Number of participants with installations prior to 7/18/2011, 7/21/2011, and 7/22/2011, respectively, per the participant database received from <IMPLEMENTER> in June 2012.

Table 99 provides tracked and evaluated energy and demand savings for the 2011 program year. The program scorecard calculated savings only for participants who enrolled in 2011 (3,599 customers), not for the existing active participant population of over 34,000 customers. As a result, reported values significantly understated actual savings, since savings should have been applied to the entire participant population. For energy savings, the realization rate is 2.52, based on evaluated energy savings of 89,127 kWh across the three events applied to the population participating at the time the events were called.

Table 99: Tracked and Evaluated Energy and Demand Savings

Program   Tracked Quantity   Evaluated Quantity   Tracked kWh   Evaluated kWh   Tracked kW   Evaluated kW    Realization Rate kWh   Realization Rate kW
ACLM      3,599              34,263               39,585        89,127          3,599        12,079–17,325   2.52                   3.36–4.81
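The demand realization rates in Table 99 follow directly from dividing evaluated by tracked kW; a minimal sketch using the table's values:

```python
# Demand realization rate = evaluated kW / tracked kW (Table 99 values).
# The tracked figure reflects only 2011 enrollees at the assumed 1 kW/home.
tracked_kw = 3_599
evaluated_kw_range = (12_079, 17_325)  # lowest/highest event-day load shed

rates = [round(kw / tracked_kw, 2) for kw in evaluated_kw_range]
print(rates)  # [3.36, 4.81], the kW realization-rate range in Table 99
```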

6.5 PROGRAM IMPLEMENTATION

The ACLM program uses <IMPLEMENTER>, a third-party vendor, to install the <TECHNOLOGY VENDOR> 5200 load control receiver, which cycles air conditioners to manage load during the peak usage months of June through September.

<IMPLEMENTER> has the following contractual responsibilities:

- Overall implementation oversight
- Marketing/advertising, including developing and executing marketing strategies that are complementary to <UTILITY>'s marketing efforts
- Customer service (call center/web enrollment)
- Device installation
- Device quality control audits and service
- Establishment of customer database

Customers can enroll in one of three ways:

1. Calling the <IMPLEMENTER> ACLM customer service number
2. Returning the self-mailer enrollment form
3. Enrolling through the <UTILITY> website

At the time of enrollment, <IMPLEMENTER> verifies that the customer is an <UTILITY> customer, that the AC unit is operational, and that the unit is accessible to the technician. <IMPLEMENTER> schedules installation within 45 days of contact, and customers begin accruing bill credits immediately, before the device is installed. <IMPLEMENTER> updates <UTILITY> regularly through electronic enrollment files. The device is installed if the unit is accessible and in good repair; the technician then updates the tracking files with the device ID and basic characteristics of the AC unit.

Cycling Strategy

Once a device has been installed, <UTILITY> may choose to cycle a participating unit any time between June and September, between 12:00 p.m. and 7:00 p.m. The same cycling strategy is used for all program participants.

TrueCycle

The <TECHNOLOGY VENDOR> 5200 load control receiver can receive a signal telling the unit to track usage for a day even when no event is called. <UTILITY> sends these signals on days projected to have weather similar to potential load management days but on which no actual event will be called. This usage data is then used during an actual event to estimate a cycling schedule that should reduce compressor use by the specified duty cycle. The initial usage measurement also allows <UTILITY> to identify units that are likely not in use, enabling more accurate measurement of the load expected to be shed.
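TrueCycle's actual algorithm is proprietary; the following is only a rough sketch of the underlying idea, under our own assumption that curtailment off-time is scaled to each unit's learned baseline runtime rather than to a fixed fraction of the clock hour.

```python
def off_minutes_per_hour(baseline_runtime_min: float,
                         target_reduction: float = 0.30) -> float:
    """Minutes per hour the compressor must be held off to shed
    target_reduction of the unit's own observed load.

    An oversized, always-running unit needs more off-time than a
    right-sized unit that already cycles on its own; a fixed
    30%-of-60-minutes scheme would over- or under-curtail both.
    """
    return baseline_runtime_min * target_reduction

print(off_minutes_per_hour(60))  # always-on unit: 18.0 minutes off
print(off_minutes_per_hour(40))  # unit running 40 min/hour: 12.0 minutes off
```

This illustrates why the vendor claims more precise reductions on oversized AC units: the schedule adapts to measured behavior instead of assuming every compressor runs continuously.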

<UTILITY> is currently experimenting with alternate cycling strategies on a sample of 67 participants, the same participants whose data were used for the impact analysis. These participants are activated on normal event days, but also on other hot days on which no event is called. The cycling percentage in these test events varies from 30% to 40%. Two test events were called in late August (August 23 and 24); these were not included in the impact evaluations, and the data for those days were removed from the summer hourly data used for analysis. <UTILITY> then calculates the load that could likely be managed under these alternate cycling schedules.

6.6 PROGRAM INSIGHTS & RECOMMENDATIONS

Program insights and recommendations are based on interviews with program managers, program implementers, and the technology vendor, along with a review of program materials and documentation, including scorecards, the marketing plan and materials, and the tracking database. Recommendations also draw on a best-practice review of AC cycling strategies and evaluations from utilities across the United States, and on the evaluation team's experience conducting demand response program impact analyses. Below are insights from the evaluation research and, where relevant, recommendations.

1. Observed program impacts are based on several variables that can be changed depending on the needs of the program

The average observed load impacts ranged from 0.29 kW to 0.51 kW per premise per hour. These values are lower than those observed in <UTILITY>'s 2006 and 2007 evaluations of the program (average values of 1.05 kW and 0.74 kW per home per event period, respectively). These differences may be due to several factors, for example:

1. 2011 events were called earlier in the day, with start times between 12:00 noon and 1:00 p.m., whereas events in earlier years were peak coincident (between 3:00 and 6:00 p.m.). There is generally less AC load earlier in the day than later in the day as temperature rises, which may result in less load shed on average per unit for events called earlier in the day.

2. 2011 events used a 30% duty cycle. A lower duty cycle necessarily provides less load shed per premise than a higher one. The previously observed value of 1.05 kW per home was based on a 50% duty-cycle strategy.

3. Hourly data for the treatment group come from smart meters, so the load data reflect not only HVAC loads but also other end uses (refrigeration, lighting, etc.). Interestingly, the observed load drop on the warmest day (maximum temperature of 99°F) was lower than on a relatively cooler day (90°F). Customers may be inclined to stay indoors during extreme temperatures, which increases other premise loads even while HVAC load is curtailed.

Recommendations:

- A participant survey should be conducted with customers following AC events to understand premise occupancy and appliance usage patterns on event days.

- To the extent possible, only HVAC load should be metered, to avoid noise from increases in other end-use loads among sample customers.

- <UTILITY> should ensure that the treatment sample is representative of the overall treatment population in terms of average daily usage, and representative of both commercial and residential customers.

- <UTILITY> should experiment with calling events at higher duty cycles, provided this is not too inconvenient for customers (which can be gauged by the participant surveys referenced above).

2. It is unclear whether the treatment group is statistically representative of the larger participant population

As noted in the Impact Analysis section of this chapter, the evaluation of the residential ACLM program is based on data from 67 customers. We were unable to verify whether this sample is statistically representative of the entire population because we did not receive key demographic information on the total participant population from <UTILITY>, most notably building type, whether the switch is installed on a primary or secondary unit, zip codes, and average energy consumption.


As noted earlier, the load control receiver has limitations in the time period over which it can collect data from specific units, and downloading data from receivers requires a costly on-site visit. <UTILITY> therefore outfitted the treatment homes with smart meters, an acceptable approach to collecting interval data (though with the limitation that the interval energy and demand data include all end uses in a premise, not only HVAC loads). After a review of the sample against the tracking database, it may be that the treatment group sample insufficiently represents the larger population. In that case, new meters would need to be installed in an adjusted and/or enhanced treatment group sample.

We note that the treatment group sample size may need to change, based on our finding that the participant population is not homogeneous and may change in the future.

Ideally, a statistically representative control group, which is equivalent (in terms of location, HVAC unit and premise profile, etc.) to the treatment group should be derived to obtain a robust estimate of load impact. This, of course, is more costly because unless the control group premises are already outfitted with smart meters or HVAC loggers before the treatment period, a technology and installation cost must be incurred to obtain data from this group.

Recommendation: <UTILITY> should review the statistical representativeness of its current treatment group. Should it determine that the group is not representative, the treatment group should be adjusted and/or expanded so that future evaluation results can be deemed statistically representative of the participant population.

Cost permitting, <UTILITY> should establish a statistically representative control group with characteristics equivalent to the treatment group, and collect hourly data (or data similar to that collected for the treatment group) for future impact evaluation purposes.

3. Scorecard reports numbers and savings based on current year enrolled participants, rather than total participant population; this significantly understates the program savings

Given that the program has annual enrollment targets, it makes sense that this metric is included in the program scorecard. However, when it comes to estimating overall program energy and demand savings, these estimates should apply to the entire actively enrolled population, which is estimated to be over 34,000, not only those enrolled in the current program year. This is because even if a participant enrolled in a previous program year, they continue to participate in the events of subsequent program years. For example, a participant who enrolled in 2010 continues to participate in events in 2011. The scorecard significantly underestimates program capacity and energy savings potential, potentially by a factor of 10.

Recommendation: While <UTILITY> may choose to track program enrollment on an annual basis, we recommend that the scorecard include energy and demand savings at the active-participant-population level, not only for customers enrolled during a given program year. It should therefore include annual performance (enrollments in 2011) as well as cumulative performance (enrollments to date, removing customers who opted out, replaced their HVAC unit, or had a malfunctioning unit), and apply energy and demand savings to the entire participant population.

4. The per-unit impact estimate of 1 kW per home appears to be overestimated given the duty cycle used to call 2011 events

The program in general assumes that participant premises produce savings of 1 kW per home. The 2011 scorecard attributed this value to the year’s enrollees and estimated that the program savings were 3,599 kW for the program period.

Page 23: Air Conditioning Load Management Impact and Process Report · Air Conditioning Load Management Impact and Process Report -UTILITY- Residential Core Plus Programs EMV Report PY1_2011_12_12_FINAL.docx

Air Conditioning Load Management Impact and Process Report

-UTILITY- Residential Core Plus Programs EMV Report PY1_2011_12_12_FINAL.docx Page 97

<UTILITY> has two test reference cases: an August 2007 event using a 30% TrueCycle, and an August 2006 event at 50% TrueCycle. The three events called in 2011 used a 30% cycle. Thus, notwithstanding other event-day variables (temperature, humidity, etc.), demand savings attributed to the program should have been estimated at 0.74 kW per home, the result from the most recent 30% TrueCycle event.

Table 100: August 2007 Event Day at 30% TrueCycle (Hourly kW per Device)30

Hour Ending   Reduction     Hour Ending   Recovery
3:00 PM       -0.73         7:00 PM       0.10
4:00 PM       -0.68         8:00 PM       0.29
5:00 PM       -0.73         9:00 PM       0.27
6:00 PM       -0.81         10:00 PM      0.14
Average       -0.74         Total         0.23

The August 2006 event, using a 50% TrueCycle, yielded an average of 1.05 kW per home.

Table 101: August 2, 2006 Event Day at 50% Duty Cycle (Hourly kW per Device; Reduction Is the Average of 4:00–6:00 PM)31

Hour Ending   Reduction     Hour Ending   Recovery
3:00 PM       -0.81         7:00 PM       0.31
4:00 PM       -1.12         8:00 PM       0.46
5:00 PM       -1.08         9:00 PM       0.36
6:00 PM       -0.96         10:00 PM      0.34
Average       -1.05         Total         0.37
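Table 101's caption states that the reported reduction is the average of the 4:00–6:00 p.m. hours; assuming the same averaging window, Table 100's -0.74 kW figure is reproduced as well:

```python
# Hour-ending 4:00-6:00 PM reductions (kW per device) from
# Tables 100 and 101.
event_2007_30pct = [-0.68, -0.73, -0.81]  # 30% TrueCycle, August 2007
event_2006_50pct = [-1.12, -1.08, -0.96]  # 50% TrueCycle, August 2006

def avg(vals):
    """Simple mean, rounded to match the tables' precision."""
    return round(sum(vals) / len(vals), 2)

print(avg(event_2007_30pct))  # -0.74, as in Table 100
print(avg(event_2006_50pct))  # -1.05, as in Table 101
```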

We also looked at other programs across the country. Load estimates are highly dependent on multiple variables within a particular program design, temperature, and customer type, and cannot necessarily be extrapolated. However, we reviewed the PG&E SmartAC program32, which also uses load switches and has a significant enrolled population of over 150,000 residential customers and almost 6,000 small commercial customers. Within this population, a 50% duty cycle yielded an average load reduction of 0.5 kW per home, whereas a 30% duty cycle yielded an average of 0.33 kW per small commercial customer over the same program period. While there are inherent differences between residential and small commercial loads, the AC units were similar in size across these two populations, which were located in geographically similar areas and exposed to the same external factors (e.g., temperature, humidity) on the same event days. In summary, increasing the duty cycle to 50% would likely yield a higher observed load impact for participating <UTILITY> homes.

Recommendation: Energy and demand savings estimates for a given program year should be based on the best available estimate reflecting conditions similar to those of the events called (e.g., duty cycle, temperature/humidity index, location, day of week), where such estimates exist. For program year 2011, the best estimates would have been the 2007 event results, given the 30% duty cycle used. For future estimates, the 2011 impact estimates should be used, provided that future events are called using a 30% cycle.

30 Source: <UTILITY>, based on impact estimates of 2007 AC cycling events.
31 Source: <UTILITY>, based on impact estimates of 2006 AC cycling events.
32 PG&E's DR Program 2012 Season Reflections, presented at the Peak Load Management Conference, November 2012.


5. Given the technology selected (one-way load control receiver), there is some difficulty in estimating the actual number of active program participants

The load control receiver receives one-way signals; as such, it is not feasible to determine remotely whether receivers remain in place and/or are operable. The program has been in place for a decade, so some degradation of the enrolled population is expected due to replacement of AC units or device malfunction. Through a QA/QC process, both <IMPLEMENTER> and <UTILITY> do sometimes identify devices that are no longer working properly. Often these devices have been disabled by an AC repair contractor while servicing the unit for an unrelated problem. Each device carries a prominent printed message asking any technician servicing the AC unit to call <IMPLEMENTER> before removing or disabling the device, but contractors often ignore it.

If customers move, <UTILITY> has procedures to keep the load control receiver installed and to ask the new occupant to opt out within 30 days of moving in if they wish to do so.

Still, <UTILITY> does not have visibility into actual active enrollment levels, which creates a challenge in estimating impacts for the entire enrolled population when the actual number is uncertain.

<IMPLEMENTER> and <UTILITY> plan to address this issue by using two different quality control procedures: an on-site audit program, and the Virtual Visit maintenance program.

Audit Program

<IMPLEMENTER> visits 10% of active participants, stratified by the length of time the device has been active in the field. If a device is found to be faulty, it is tracked and replaced. A very small percentage of devices are found to be faulty; issues include incorrect installation, incorrect serial numbers, defective hardware, and tampering by an AC repair contractor. <IMPLEMENTER> tracks devices that need service, reasons for failure, and statistical accuracy rates. This process is required by the <IMPLEMENTER> service contract with <UTILITY>.

Virtual Visit Maintenance Program

<UTILITY> manages the Virtual Visit program. Using their automated meter reading system (AMR), <UTILITY> will capture usage during a pre-arranged test day, or during an actual event. A test period is arranged, in which the device is activated for 1 hour (cycling usage by 30%). The AMR then obtains meter usage data for an hour prior to the test period, the hour of the test period, and the hour after the test period. The meter data allows <UTILITY> to identify devices that are working, faulty, or free riders. This helps <UTILITY> calculate likely lost load reduction due to faulty devices. A list of devices that are likely to be faulty is compiled and given to <IMPLEMENTER>, who schedules a site visit for each. In most cases, the circuit board of the device can be replaced, which also allows <IMPLEMENTER> to upgrade the device to the latest version of hardware and software.
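The Virtual Visit comparison can be sketched as follows. The classification thresholds and tolerance here are illustrative assumptions, not <UTILITY>'s actual criteria:

```python
def classify_device(pre_kwh: float, test_kwh: float, post_kwh: float,
                    duty_cycle: float = 0.30, tolerance: float = 0.5):
    """Rough classification of a load control receiver from hourly AMR
    reads around a one-hour 30% cycling test.

    Compares observed test-hour usage against the expected curtailed
    usage (baseline reduced by the duty cycle). All thresholds are
    hypothetical, chosen only to illustrate the logic.
    """
    baseline = (pre_kwh + post_kwh) / 2  # bracket hours approximate baseline
    if baseline < 0.1:
        return "no load"      # AC likely not running; test inconclusive
    expected = baseline * (1 - duty_cycle)
    # Accept the device if usage fell at least halfway toward the
    # expected curtailed level.
    if test_kwh <= expected + tolerance * (baseline - expected):
        return "working"
    return "possibly faulty"

print(classify_device(3.0, 2.1, 3.0))  # ~30% drop observed during the test hour
print(classify_device(3.0, 3.0, 3.0))  # no drop: flag for a site visit
```

Devices flagged as possibly faulty would then feed the site-visit list described above.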

Customer Service Calls

<IMPLEMENTER> regularly receives calls from participants who believe their AC is malfunctioning because an event was called or because of the device itself. However, since the program called no events between 2007 and 2011, and only three events in 2011, most of these calls occur on days when the customer's AC is not being cycled.

Recommendation: The number of known active participants should be flagged in the tracking database. The remaining population should be assessed regularly to ensure that treatment sample customers are representative of the entire active population (in terms of geography/zip code, housing profile, etc.). The total estimated active participant population should be used for attribution of energy and demand savings.

6. Given program saturation, marketing effectiveness is likely to experience diminishing returns

The ACLM program has been in place for more than a decade, and has enrolled more than 36,000 customers. <UTILITY> has estimated that the market will become saturated at 50,000 participants. Thus, it is necessarily becoming more difficult to recruit new participants. The current marketing campaign aims at reaching potential participants through indirect channels such as the “Cool Schools” and the “Give Thanks” campaigns, as well as continuing with direct mail efforts.

Most of the marketing efforts in 2011 took place in August and September and included:

- Middle school program (goal: 8 to 10 schools) – "Cool Schools"
- Soup kitchens (bill stuffer and email campaign) – "Give Thanks"
- Email campaigns – "Green Pays. And that's Cool."
- Direct mail – "Green Pays. And that's Cool."

School Program – “Cool Schools”

The goal of this portion of the campaign is to convince 8 to 10 schools to participate. <UTILITY> has calculated that the average Indiana middle school has approximately 400 students, so eight participating schools have the potential to reach 3,200 households. Schools were recruited through phone calls, letters, and flyers.

Each participating school received a starter packet containing program information and class materials, as well as a mid-program follow up check-in.

Soup Kitchens – “Give Thanks”

Through bill inserts, postcards, and email blasts, customers were encouraged to join CoolCents, and were informed that a $20 donation would be made in their honor to a local soup kitchen. In 2009, <UTILITY> worked with Gleaners Food Bank of Indiana Inc., though the evaluation team did not receive updated information as to the current partnership.

Email Campaigns - Various

Three different email blasts were sent, with different themes (October - Soup Kitchen, November - Soup Kitchen, December - Holiday Push).

Direct Mail - “Green Pays. And that’s Cool.”

A large-format postcard on glossy card stock was sent to customers. These mailers concentrated on messaging that <UTILITY> has judged effective in previous years of the program, combining a “green” environmental message with financial incentives. They also announced a $2,500 giveaway for which customers could become eligible by enrolling in the ACLM program.

Due to the increasing saturation of the market, <IMPLEMENTER> has said that it will increase marketing efforts in 2012 in an effort to continue to meet program goals.

Recommendation: Going forward, track enrollees by marketing outreach format to refine marketing and outreach strategies. <UTILITY> should assess the saturation rate of participation in the desired target population (single-family homes and commercial premises with average or above-average energy consumption and an AC unit) to assess the probability of reaching its original target of 50,000 units without compromising program cost-effectiveness or significantly decreasing observed load by drawing from less optimal populations. Should <UTILITY> determine the market saturation for program participation, it should revisit its marketing strategy in light of this information and/or revise the program’s total enrollment goals.

7. Inclusion of multifamily units to meet marketing goals will likely lead to overall lower expected load per participating premise

<UTILITY> is considering changing its marketing to include multifamily units to address saturation of the single-family market. Multifamily sites often do not use as much AC-related electricity as stand-alone homes, so the potential for load management is smaller on a per-unit basis. However, it is possible to install many devices at the same site, reducing marketing and site-visit costs.

Recommendation: <UTILITY> should study the expected load from multifamily premises and the impact on cost-effectiveness if it plans to actively shift its marketing strategies toward this residential segment. Typically, an average multifamily home has lower consumption than an average single-family home and thus yields less load reduction during an event. AC units are typically undersized, and actual consumption is also highly dependent on the orientation of the home within a complex. Thus, load shed by this segment during an event would likely be lower than that observed in an average single-family home or commercial premise. This has potential impacts on program cost-effectiveness, and it will also require stratification of the sample population to ensure that load impacts for multifamily units are understood separately from those for single-family residential and commercial premises.

8. Data from the sample group used for impact evaluation is incomplete

Interval smart meter data were collected for the treatment population. These meters measure energy consumption at the whole-house level, so null values are not expected in the data: even when the customer is away, some load typically remains connected, such as refrigerators and other plug loads. However, null values were present in various instances.

Table 102 provides an overview of the problematic sample data. There were two types of data of concern: missing data (where no information was available) and data recorded as zeroes.

Table 102: Missing Sample Data

| Data type | Observations: Number | Observations: % of all (n=175,896) | Unique participants: Number (multiple response) | Unique participants: % of all (n=67) |
| --- | --- | --- | --- | --- |
| Zero | 2,352 | 1.3% | 16 | 23.9% |
| Missing | 358 | 0.2% | 21 | 31.3% |
| Total | 2,710 | 1.5% | 32 | 47.8% |

For purposes of the analysis, these intervals were treated as invalid data and excluded, as they were skewing observed load on non-event days for the participant sample and thus putting downward pressure on the regression coefficients for load. No null values were observed during event days.
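The exclusion step described above can be sketched as follows. This is an illustrative sketch only: the data layout, participant IDs, and timestamps are hypothetical and are not taken from the evaluation dataset.

```python
# Hypothetical sketch of the interval-data cleaning step: whole-house
# interval readings that are missing (None) or exactly zero are treated
# as invalid and excluded before the regression analysis.

def clean_intervals(readings):
    """Split hourly kWh readings into valid, zero, and missing intervals.

    `readings` maps (participant_id, timestamp) -> kWh, or None when the
    meter returned no value for that interval.
    """
    valid, zeros, missing = {}, [], []
    for key, kwh in readings.items():
        if kwh is None:
            missing.append(key)     # no information available
        elif kwh == 0:
            zeros.append(key)       # implausible whole-house zero
        else:
            valid[key] = kwh
    return valid, zeros, missing

# Illustrative readings (participant IDs and timestamps are made up).
sample = {
    (16, "2011-07-12 14:00"): 1.8,
    (16, "2011-07-12 15:00"): 0.0,   # whole-house zero: flagged
    (21, "2011-07-12 14:00"): None,  # missing interval: flagged
    (21, "2011-07-12 15:00"): 2.4,
}
valid, zeros, missing = clean_intervals(sample)
print(len(valid), len(zeros), len(missing))  # 2 1 1
```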

Recommendation: Accuracy and completeness of smart meter data should be established for treatment groups to ensure adequate data for impact analysis.


9. If a program year 2012 evaluation is needed, it will be important to undertake activities to ensure availability of data for evaluation purposes

The sample design review will reveal whether the current sample is statistically representative of the population. If not, a different treatment group may need to be selected. If data collection depends on installing smart meters at treatment locations, the installations must occur before ACLM events are called in 2012.

Additionally, participant surveys are an informative way to collect information on customer awareness of events when they are called, and to understand customer behavior during peak periods; this can inform the ACLM program in terms of what duty cycle to use, when to call events, and how to maximize observed load shed. Typically, these surveys would be fielded on a rolling basis following ACLM events. If this approach is to apply for summer 2012, the survey instruments must be drafted and approved before events are called, which requires that PY2 evaluation activities begin immediately.

Recommendation: Should <UTILITY> plan to conduct evaluation activities for the ACLM Program for the 2012 program period, it should consider the following: (1) draft and field surveys to program participants following AC Cycling events, should any be called in 2012; and (2) establish a sample treatment group that is representative of the entire treatment population and that includes small commercial customers, by installing smart meters or other data collection devices for this population prior to calling 2012 ACLM events.

10. Consider calling events using economic triggers

Calling events during periods of high prices can have a marked effect on market prices, given the spiky nature of Locational Marginal Prices (LMPs). Even small reductions in overall demand can lower market prices, which apply to every kWh used in the service territory, not simply to the premises with load reduction. Savings are therefore realized that can be shared indirectly across all customer classes.

Figure 19: Load Management Savings Diagram


A review of Locational Marginal Prices in Indiana in 2011 indicated significant LMP volatility in Northern Indiana, especially during the June-through-September program season.

Figure 20: Northern Indiana LMP

The average LMP hovers around $30.00. Yet for 7% of the hours in the period, LMP exceeded 150% of the average, and for 2% of the period, or roughly 150-160 hours, the wholesale price exceeded 200% of the average LMP.

Table 103: Percentage and Number of Hours over Average LMP in Northern Indiana, 2009-2011

| | 2009 | 2010 | 2011 |
| --- | --- | --- | --- |
| Average price | $29.26 | $31.54 | $31.02 |
| # of hours over average price | 2,912 | 2,099 | 2,234 |
| % exceeding average price | 33% | 24% | 26% |
| 150% of average price | $43.89 | $47.30 | $46.53 |
| # of hours exceeding 150% of average price | 400 | 639 | 612 |
| % exceeding 150% of average price | 5% | 7% | 7% |
| 200% of average price | $58.51 | $63.07 | $62.04 |
| # of hours exceeding 200% of average price | 93 | 160 | 155 |
| % exceeding 200% of average price | 1% | 2% | 2% |
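Threshold statistics of the kind shown in Table 103 can be recomputed from any hourly LMP series with a short script. The price series below is synthetic, chosen only to illustrate the calculation; it is not actual Northern Indiana market data.

```python
# Illustrative computation of hours exceeding the average LMP and the
# 150% / 200% thresholds, mirroring the structure of Table 103.

def lmp_threshold_stats(prices):
    """Return (average, hours over avg, hours over 150% of avg, hours over 200% of avg)."""
    avg = sum(prices) / len(prices)
    over = sum(1 for p in prices if p > avg)
    over_150 = sum(1 for p in prices if p > 1.5 * avg)
    over_200 = sum(1 for p in prices if p > 2.0 * avg)
    return avg, over, over_150, over_200

# Synthetic 100-hour series: mostly ~$28/MWh with a few spikes,
# mimicking the spiky LMP distribution described in the report.
prices = [28.0] * 90 + [45.0] * 6 + [50.0] * 2 + [80.0] * 2
avg, over, over_150, over_200 = lmp_threshold_stats(prices)
print(f"avg=${avg:.2f}, >avg: {over}h, >150%: {over_150}h, >200%: {over_200}h")
```

With a full 8,760-hour year of prices, the same function reproduces the counts and percentages reported in the table.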

Recommendation: <UTILITY> may want to consider calling cycling events based on economic triggers. These events could be called on days that are not coincident with system emergencies if <UTILITY> wants to monetize the value of the program in addition to achieving overall system reductions. (We note that high peak-demand periods typically coincide with high wholesale energy prices.)
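An economic trigger of the kind recommended above can be sketched as a simple price test against a forecast. The $46/MWh threshold (roughly 150% of the ~$31 average in Table 103) and the hour-by-hour forecast values are illustrative assumptions, not parameters from the program.

```python
# Minimal sketch of an economic dispatch trigger: call a cycling event
# for any hour whose forecast LMP exceeds an assumed price threshold.

TRIGGER_PRICE = 46.0  # $/MWh; assumed, roughly 150% of the average LMP

def event_hours(forecast, trigger=TRIGGER_PRICE):
    """Return the hours whose forecast LMP exceeds the trigger price."""
    return [hour for hour, price in forecast.items() if price > trigger]

# Hypothetical day-ahead forecast: hour of day -> $/MWh.
forecast = {13: 32.0, 14: 48.5, 15: 63.0, 16: 44.0}
print(event_hours(forecast))  # [14, 15]
```

In practice the trigger price would be set against expected program costs and avoided wholesale purchases, so that events are called only when the market value of the load shed exceeds the cost of the event.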