H Gil Peach & Associates
Verification Review of Program Year 2016 Evaluation Results Report for the Nova Scotia Utilities and Review Board
H. Gil Peach, Ph.D.
John Mitchell, B.S.
Mark Thompson, M.S.
C. Eric Bonnyman, M.S., P.Eng.
July 17, 2017
Vision Statement
To be a world leader in developing truthful measurement and useful results; to support
development of efficient, ethical, and effective practices, sustained economically; to
advance human development.
Goals Statement
• Excellence in the integration of knowledge, method, and practice
• Improvement and learning at all levels
• Contextually sound measurement, analysis, and reporting
• Anticipate and meet the needs of our clients
• Awareness of human relevance and of the ethical core of research
• To go further, to find better ways
Mission Statement
With extensive experience in North America, together we can provide the full range of
management, planning, and evaluation services – wherever and whenever there is a
need.
Climate Adaptation Statement
Improve the Quality of Life during the Era of Climate Change
H. Gil Peach & Associates LLC & Adapt.Global Inc.
Websites: www.scanamerica.net, www.adapt.global, www.peachandassociates.com, www.peachandassociates.net
H. Gil Peach & Associates LLC
H. Gil Peach, Ph.D.
16232 NW Oak Hills Drive Beaverton, Oregon 97006-5242, USA
Telephone: (503) 645-0716 EIN: 11-3783390 (Revised 6/30/2006) Fax: (503) 946-3064
Verification Review
Of Program Year 2016 Evaluation Results
Table of Contents
I. Overview
II. Demand-Side Management Cycle
   A. Planning
   B. Implementation
   C. Evaluation
   D. Savings Verification
III. First Year Results – Net at Generator
   A. All Evaluated Results
   B. ENS Portfolio Evaluated Results
      1) ENS Residential Portfolio
      2) ENS Business, Non-Profit & Institutional Portfolio (BNI)
   C. Codes & Standards Evaluated Results
   D. Percentage Contribution of Individual ENS Programs
IV. The Evaluation Plan
V. The 2016 Evaluations
VI. Savings Verification Review of the 2016 Evaluations
VII. General Findings
VIII. General Recommendations
IX. General Considerations
X. Individual Program Component Review
   1) Residential Efficient Product Rebates Program
   2) HomeWarming Program
   3) Instant Savings Evaluation
   4) Home Energy Assessment
   5) Green Heat
   6) Residential Direct Install
   7) Rental Properties and Condos Service
   8) New Home Construction Program
   9) Business Energy Rebates
   10) Custom Program
   11) Energy Management Information Systems (EMIS)
   12) Strategic Energy Management (SEM)
   13) Small Business Energy Solutions
   14) Regulation: Codes and Standards
XI. Looking toward the Future of Evaluation
XII. Summary of Findings
XIII. Summary of General Recommendations
XIV. Summary of General Considerations
XV. Summary of Individual Program Area Recommendations
XVI. References

Figures
Figure 1: Continuous Improvement: The DSM Cycle.
Figure 2: Percent Savings at Generator - ENS and MEPs.
Figure 3: Percent Demand Reduction at Generator - ENS vs. MEPs.
Figure 4: ENS Residential vs. BNI - Percent Energy Savings.
Figure 5: ENS Residential vs. BNI - Percent Demand Reduction.
Figure 6: Percent Energy Savings: ENS Residential by Group.
Figure 7: Percent Demand Reduction: ENS Residential by Group.
Figure 8: Percent Energy Savings: ENS BNI by Group.
Figure 9: Percent Demand Reduction - ENS BNI by Group.
Figure 10: MEPs - Net Energy Savings.
Figure 11: MEPs - Net Demand Reduction.
Figure 12: ENS Portfolio: Energy Savings by Program.
Figure 13: ENS Portfolio: Demand Reduction by Program.
Figure 14: Methodology Employed by Econoler for MEPs.
Figure 15: Ideal-Typical S-Curve.
Figure 16: S-Curve. (NEEA)
Figure 17: Iterative Sequence of Transformation Steps. (NEEA)
Figure 18: Benefits Captured in Market Transformation Evaluation.

Tables
Table 1: 2016 First Year Energy Net Impact Evaluation Results.
Table 2: ENS Residential Groups & Programs.
Table 3: Residential Energy Savings by Group.
Table 4: Residential Demand Reduction by Group.
Table 5: ENS BNI Programs by Group.
Table 6: ENS BNI Energy Savings by Group.
Table 7: ENS BNI Demand Reduction by Group.
Table 8: Minimum Energy Performance (MEPs) Groups.
Table 9: MEPs Energy Savings by Group.
Table 10: MEPs Demand Reduction by Group.
Table 11: Three-Year Impact Evaluation Plan.
Table 12: Three-Year Process Evaluation Plan.
Table 13: Three-Year Market Evaluation Plan.
Table 14: List of 2016 Evaluations.
Towards contextually sound measurement, analysis, and reporting…
I. Overview
This report is a savings verification review conducted by H. Gil Peach & Associates,
LLC for the Nova Scotia Utility and Review Board (Board). It reports on verification of
electricity energy savings and demand reduction estimated by Econoler, the
Independent Evaluator for the Demand Side Management Administrator’s (Efficiency
Nova Scotia’s) calendar 2016 energy savings and demand reduction programs.
As in previous years, the primary purpose of this report is to review, verify, and if
appropriate, recommend adjustments to the savings data and estimates created by
evaluation consultants engaged by the Demand Side Management Administrator
(DSM Administrator).1 For this reason the verification review is focused on impact
evaluation. We also provide limited comments on other parts of the evaluation and
suggestions for the future.
This examination is based on careful review of the evaluation for each program
component or initiative in the DSM portfolio, including review of each evaluation's
explicit or implicit design and of the evaluation methods, standards, and protocols
used. It is also based on review of the program data tracking system, on
discussions with staff and delivery agents concerning program procedures and
methods for program Quality Assurance/Quality Control (QA/QC), and on selected
"due diligence" site visits to check installation counts and quality of work. In
addition, the savings verification consultant watches the evolving general
standards of practice in DSM and DSM evaluation to highlight "better practices".
1 Nova Scotia Power, Inc. served as the DSM Administrator for calendar years 2008 and 2009.
Responsibility transferred to Efficiency Nova Scotia Corporation in October 2010. Calendar year 2011
represented the first full program cycle with Efficiency Nova Scotia as the DSM Administrator.
Efficiency Nova Scotia (ENS) is now classified as an independent utility and is administered by E1. As
a rule of thumb, after the fifth year a major effort such as this is considered mature. This report covers
evaluation of a mature program portfolio implemented by a seasoned staff.
The primary purpose of this report is to review, verify, and if appropriate,
recommend adjustments to the savings data and estimates created by
the independent evaluation consultants engaged by the DSM
administrator.
Verification Review of Program Year 2016 Evaluation Results
2 | P a g e
II. Demand-Side Management Cycle
Most readers of this study will have several years of experience with the yearly
cycles of DSM effort in Nova Scotia. For a new reader, the most important
orientation is to see DSM as a cycle of repeated activities with continuous
process improvement. This feature of continuous improvement, through iterations
of implementation experience and planning, is central to the nature of energy
efficiency work; it is how organizational learning operates. Figure 1 shows
the primary steps of the iterative DSM cycle.2
Figure 1: Continuous Improvement: The DSM Cycle.
The arrows in Figure 1 show the high-level flow of work within a DSM cycle.
Implementation, Evaluation and Savings Verification each feed information forward to
2 Other steps (not shown) are periodic potential studies, benefit/cost testing and special studies.
the Planning step. Planning proceeds continuously. In addition, every three to five
years, depending on the regulatory case schedule, there is an intensified major
planning effort that sets the pattern for Implementation for the next set of years. As
noted above, the focus of the cycle as a whole is on continuous improvement.3 This
report for calendar year 2016 covers the ninth program year from the 2008 start and
the sixth full year with an independent energy efficiency and conservation
administrator, currently Efficiency Nova Scotia (ENS).
A. Planning
The function of the Plan is to develop program plans and to provide estimates. The
Plan provides a high-level blueprint for Implementation in the form of a program
portfolio, the results of benefit/cost tests and specific program plans. Planning
estimates of energy savings and demand reduction are ex-ante estimates. They may
be informed by the Technical Resource Manual (TRM), deemed values synthesized
from secondary sources, values from the tracking system from a previous cycle,
evaluation results from a previous cycle, and/or engineering modeling and
estimation. Sometimes program structure and expected performance data is taken
from the Consortium for Energy Efficiency (https://www.cee1.org/), an organization
for the exchange of information about energy efficiency programs to which Efficiency
Nova Scotia belongs; or from another organization that develops information on
energy efficiency programs; or from a specific program in another jurisdiction.
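The arithmetic behind such ex-ante planning estimates is straightforward: planned installation counts multiplied by deemed per-unit values. The sketch below illustrates the calculation in Python; the measure names and per-unit figures are hypothetical and are not taken from the Nova Scotia TRM.

```python
# Hypothetical deemed (per-unit) values, as a TRM might supply them.
# measure: (kWh saved per unit per year, kW demand reduction per unit)
DEEMED = {
    "led_bulb": (30.0, 0.004),
    "heat_pump_water_heater": (1800.0, 0.15),
}

def ex_ante_estimate(planned_installations):
    """Sum deemed energy and demand savings over planned unit counts."""
    total_kwh = 0.0
    total_kw = 0.0
    for measure, units in planned_installations.items():
        kwh_per_unit, kw_per_unit = DEEMED[measure]
        total_kwh += units * kwh_per_unit
        total_kw += units * kw_per_unit
    return total_kwh, total_kw

# A planning-cycle estimate for two illustrative measures.
kwh, kw = ex_ante_estimate({"led_bulb": 100_000, "heat_pump_water_heater": 500})
```

Evaluation later replaces such planning assumptions with measured or modeled ex-post values, which is what the verification review examines.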
B. Implementation
The Implementation step in each cycle feeds information forward to Planning through
the development of practical knowledge gained by program managers and delivery
agents. Implementation is documented and supported by the program data tracking
system. Work in Implementation means being “at the edge”, encountering emergent
realities that require alertness and sometimes quick adaptation based on new
information. In this way, the programs are adapted in real time from planning
perspectives to field realities. In field experience, situations, forces and limits emerge
3 Morse, William L. & H. Gil Peach, “Control Concepts in Conservation Supply,” Energy, Vol. 14, No.
11, Pp. 727-735, 1989; Gellings, Clark W. & John H. Chamberlin, Demand-Side Management
Planning Concepts & Methods, Second Edition, Liburn Georgia: The Fairmont Press, 1992; Gellings,
Clark W. & John H. Chamberlin, Demand-Side Management Planning, Liburn, Georgia: The Fairmont
Press, 1993.
that cannot be anticipated.4 Each of the other steps in the DSM cycle feeds in
information to help Implementation.
C. Evaluation
The Evaluation step produces independent results assessment in a yearly formal
evaluation report. However, it should also be feeding timely information back to the
Implementation step to provide opportunity for in-progress program corrections
(without waiting for the full evaluation report).5
Evaluation produces results from independent measurements of auxiliary variables
and from observations, site inspections, surveys, interviews, literature reviews and
analyses. Results are based on, for example, engineering and statistical analysis of
electrical measurements, hours of operation, interaction effects, the assessment of
free-riders and spillover effects, and the development of net-to-gross ratios.
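As an illustration of how such net results are conventionally assembled, the sketch below applies the common additive net-to-gross convention (NTG = 1 - free-ridership + spillover) and then grosses meter-level savings up to the generator with a line-loss factor. This is a generic sketch; the specific adjustments and loss factors used in the Nova Scotia evaluations may differ.

```python
def net_to_gross(free_ridership: float, spillover: float) -> float:
    """Conventional additive net-to-gross ratio: NTG = 1 - FR + SO."""
    return 1.0 - free_ridership + spillover

def net_savings_at_generator(gross_kwh_at_meter: float,
                             free_ridership: float,
                             spillover: float,
                             line_loss: float) -> float:
    """Apply NTG to gross meter-level savings, then gross up to the generator.

    A common convention divides meter-level savings by (1 - line_loss) so
    that avoided line losses are credited to the program.
    """
    net_at_meter = gross_kwh_at_meter * net_to_gross(free_ridership, spillover)
    return net_at_meter / (1.0 - line_loss)
```

For example, 1,000 kWh of gross meter-level savings with 20% free-ridership, 5% spillover, and an 8% line-loss factor yields about 924 kWh net at the generator.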
Evaluation includes review of baseline(s) along with both gross and net impact
results. In recent years, impact evaluations of individual programs within the overall
program portfolio are expected, where possible, to be structured in reference to
selection of an evaluation protocol (while reserving professional decisions to the
independent evaluator).
In practice, DSM evaluation generally operates from a resource acquisition model,
combining impact evaluation with market evaluation. It is also possible to conduct an
evaluation from a market transformation perspective. And, it is becoming relevant to
include climate adaptation perspectives in selected evaluations.
Evaluation also includes process evaluations, designed to improve the efficiency,
effectiveness, community service orientation and appropriateness of program
processes. Evaluation may also include policy evaluations.
4 For an early article on this point, see Keating, Kenneth M., Ruth L. Love, Terry V. Oliver, H. Gil
Peach & Cynthia B. Flynn, “The Hood River Project; Take a Walk on the Applied Side,” Pp. 112-118,
The Rural Sociologist, Vol.5, No. 2, May 1985.
5 The focus is on continuous improvement.
D. Savings Verification
Both the Implementation step and the Evaluation step feed information forward to
Savings Verification. The information fed forward includes program documents and tracking
records, electrical measurements, analysis plans and methods employed, relevant
evaluation protocols and results presented in evaluation reports. Savings Verification
confirms and, when necessary, questions the Evaluation data, methods and reported
results.
III. First Year Results – Net at Generator
For 2016, Efficiency Nova Scotia (ENS) administered a residential program portfolio
plus a business, non-profit and institutional (BNI) portfolio. Econoler presents
evaluations of these programs. In addition, outside of the ENS portfolio, in Codes and
Standards, Econoler evaluated five Minimum Energy Performance standards (MEPs).
A. All Evaluated Results
Table 1: 2016 First Year Energy Net Impact Evaluation Results.
Table 1, Column 1 shows evaluated first year net energy savings at the generator;
the evaluated demand reduction at the generator is shown in Column 3 (Columns 2
and 4 express the same results as percentages). Figures 2 through 5 (below)
summarize data in the form of percentages. Results for MEPs are
For ENS energy savings, Econoler reports a total of 136.863 GWh first year net
energy savings at the generator. For MEPs, Econoler reports 23.445 GWh. This is
All 2016 Evaluated First-Year Results (at Generator)

                                           Col. 1    Col. 2    Col. 3    Col. 4
                                           GWh       GWh (%)   MW        MW (%)
  DSM Administrator Program Portfolio
    Residential Portfolio                   69.085    43%      15.197    49%
    Business, Non-Profit & Institutional    67.778    42%      10.773    35%
  Total, DSM Administrator                 136.863    85%      25.970    84%
  Codes & Standards (MEPs)                  23.445    15%       5.044    16%
  Total (DSM Administrator + MEPs)         160.308   100%      31.014   100%

Source: Tables 5, 6 & 7 in Econoler, Corporate Research Associates & Equilibrium Engineering, Evaluation of
2016 DSM Programs, Executive Summary, Efficiency Nova Scotia, March 8, 2016, Pp. 10, 12 & 13.
equivalent to about 85% ENS and 15% MEPs (Figure 2). For demand reduction,
Econoler reports a 25.970 MW first year net peak demand reduction at the generator
for ENS. For MEPs, Econoler reports 5.044 MW. This is equivalent to about 84%
ENS and 16% MEPs (Figure 3).
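These portfolio shares can be re-derived directly from the reported totals, a useful spot check in a verification review:

```python
# Evaluated first-year net totals at the generator, as reported by Econoler.
ens_gwh, meps_gwh = 136.863, 23.445   # energy savings
ens_mw, meps_mw = 25.970, 5.044       # demand reduction

# Portfolio shares of the combined (ENS + MEPs) totals.
energy_share_ens = ens_gwh / (ens_gwh + meps_gwh)
demand_share_ens = ens_mw / (ens_mw + meps_mw)
```

Rounded to the nearest percent, these shares reproduce the 85/15 energy split and the 84/16 demand split shown in Figure 2 and Figure 3.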
Figure 2: Percent Savings at Generator - ENS and MEPS.
Figure 3: Percent Demand Reduction at Generator - ENS vs. MEPs.
B. ENS Portfolio Evaluated Results
Considering only ENS, energy savings are split nearly 50/50 between the Business,
Non-Profit and Institutional (BNI) Portfolio and the Residential Portfolio (Figure
4), while demand reduction splits nearly 40% BNI and 60% Residential (Figure 5).
Figure 4: ENS Residential vs. BNI - Percent Energy Savings.
Figure 5: ENS Residential vs. BNI - Percent Demand Reduction.
1) ENS Residential Portfolio
Residential programs are structured within three groups (Table 2). Residential
energy savings by group and residential demand reduction by group are shown in
Table 3 and Table 4. Percentage results are shown in Figure 6 and Figure 7.
Table 2: ENS Residential Groups & Programs.
Table 3: Residential Energy Savings by Group.

Energy Savings: ENS Residential Portfolio (Generator)
  Residential Group      GWh      GWh (%)
  Product Rebates        37.315    60%
  Existing Residential   20.862    34%
  New Residential         3.908     6%
  Total                  62.085   100%

Table 4: Residential Demand Reduction by Group.

Demand Reduction: ENS Residential Portfolio (Generator)
  Residential Group      MW       MW (%)
  Product Rebates         5.935    42%
  Existing Residential    7.002    50%
  New Residential         1.118     8%
  Total                  14.055   100%
Figure 6: Percent Energy Savings: ENS Residential by Group.
Figure 7: Percent Demand Reduction: ENS Residential by Group.
2) ENS Business, Non-Profit & Institutional Portfolio (BNI)
There are three groups of Business, Non-Profit and Institutional (BNI) programs
(Table 5). Energy savings and demand reduction are shown in Table 6 and Table 7;
and as percentages in Figure 8 and Figure 9.
Table 5: ENS BNI Programs by Group.

ENS BNI Groups & Programs
  Portfolio                              Group                        DSM Program
  Business, Non-Profit & Institutional   Efficient Products Rebates   Business Energy Rebates
                                         Custom Incentives            Custom
                                                                      Energy Management Information Systems
                                                                      Strategic Energy Management
                                         Direct Installation          Small Business Energy Solutions

Table 6: ENS BNI Energy Savings by Group.

ENS BNI Energy Savings by Group (Generator)
  Group                        GWh      GWh (%)
  Efficient Products Rebates   37.797    56%
  Custom                       25.858    38%
  Direct Installation           4.123     6%
  Total for BNI Portfolio      67.778   100%

Table 7: ENS BNI Demand Reduction by Group.

ENS BNI Demand Reduction by Group (Generator)
  Group                        MW       MW (%)
  Efficient Products Rebates    7.094    66%
  Custom                        2.609    24%
  Direct Installation           1.070    10%
  Total for BNI Portfolio      10.773   100%
Figure 8: Percent Energy Savings: ENS BNI by Group.
Figure 9: Percent Demand Reduction - ENS BNI by Group.
C. Codes & Standards Evaluated Results
Five groups of Minimum Energy Performance standards (MEPs) were evaluated
(Table 8). Energy savings and demand reduction results are shown in Table 9 and
Table 10; percentages are shown in Figure 10 and Figure 11.
Table 8: Minimum Energy Performance (MEPs) Groups.

MEPs Groups
  New Energy Code for Large Buildings, Condos and Apartments
  New Energy Code for Houses and Small Buildings
  LED Street-lighting
  Old Light Bulbs (40W & 60W)
  Old Light Bulbs (75W & 100W)

Table 9: MEPs Energy Savings by Group.

MEPs Energy Savings by Group (Generator)
  Group                                                         GWh      GWh (%)
  Old Light Bulbs (75W & 100W)                                   6.324    27%
  Old Light Bulbs (40W & 60W)                                    8.435    36%
  LED Street-lighting                                            5.049    22%
  New Energy Code for Houses and Small Buildings                 1.314     6%
  New Energy Code for Large Buildings, Condos and Apartments     2.333    10%
  Total for MEPs                                                23.455   100%

Table 10: MEPs Demand Reduction by Group.

MEPs Demand Reduction by Group (Generator)
  Group                                                         MW       MW (%)
  Old Light Bulbs (75W & 100W)                                   1.217    24%
  Old Light Bulbs (40W & 60W)                                    1.624    32%
  LED Street-lighting                                            1.257    25%
  New Energy Code for Houses and Small Buildings                 0.372     7%
  New Energy Code for Large Buildings, Condos and Apartments     0.574    11%
  Total for MEPs                                                 5.044   100%
Figure 10: MEPs - Net Energy Savings.
Figure 11: MEPs - Net Demand Reduction.
D. Percentage Contribution of Individual ENS Programs
As shown in Figure 12, four programs contributed approximately 80% of Efficiency
Nova Scotia energy savings: Business Energy Rebates, Instant Savings, Custom
and Residential Direct Install. Similarly (Figure 13), five programs contributed about
80% of demand reduction: Business Energy Rebates, Instant Savings, Residential
Direct Install, Green Heat and Custom.6
Figure 12: ENS Portfolio: Energy Savings by Program.
6 Measurement is at the generator.
Figure 13: ENS Portfolio: Demand Reduction by Program.
Resource acquisition is analyzed as first-year energy savings and first-year demand
reduction (at the generator). Currently, Nova Scotia evaluations are required to
produce the level of results shown in Table 1 through Table 10 and in Figure 2
through Figure 11. The analysis and presentation are representative of a resource
acquisition approach to program evaluation. For these reasons, the approach to
analysis and the presentation format of the evaluation are fully adequate and complete
for Program Year 2016.
IV. The Evaluation Plan
For program year 2016 the evaluator conducted full impact and market evaluations,
plus process evaluations for three programs. These three process evaluations are to
be continued for 2017 and 2018. Impact evaluation is to be reduced for program
years 2017 and 2018, in a planned pattern. Market evaluation has been restricted to
seven programs for 2018. These are shown in Table 11 through Table 13.
Table 11: Three-Year Impact Evaluation Plan.

  Program Component                       Impact Evaluation
                                          2016   2017      2018
  Appliance Retirement                    Y      Reduced   Y
  Instant Savings                         Y      Reduced   Y
  Home Energy Assessment                  Y      Y         Reduced
  Green Heat                              Y      Y         Reduced
  Residential Direct Install              Y      Y         Reduced
  Rental Properties and Condos            Y      Y         Reduced
  New Home Construction                   Y      Y         Reduced
  Business Energy Rebates                 Y      Reduced   Y
  Custom                                  Y      Y         Y
  Energy Management Information Systems   Y      Y         Y
  Strategic Energy Management             Y      Y         Y
  Small Business Energy Solutions         Y      Y         Y

Table 12: Three-Year Process Evaluation Plan.

  Program Component                       Process Evaluation
                                          2016   2017   2018
  Appliance Retirement                    N      N      N
  Instant Savings                         N      N      N
  Home Energy Assessment                  N      N      N
  Green Heat                              N      N      N
  Residential Direct Install              N      N      N
  Rental Properties and Condos            N      N      N
  New Home Construction                   N      N      N
  Business Energy Rebates                 N      N      N
  Custom                                  N      N      N
  Energy Management Information Systems   Y      Y      Y
  Strategic Energy Management             Y      Y      Y
  Small Business Energy Solutions         Y      Y      Y
Table 13: Three-Year Market Evaluation Plan.

  Program Component                       Market Evaluation
                                          2016   2017   2018
  Appliance Retirement                    Y      Y      Y
  Instant Savings                         Y      Y      Y
  Home Energy Assessment                  Y      Y      N
  Green Heat                              Y      Y      N
  Residential Direct Install              Y      Y      N
  Rental Properties and Condos            Y      Y      N
  New Home Construction                   Y      Y      N
  Business Energy Rebates                 Y      Y      Y
  Custom                                  Y      Y      Y
  Energy Management Information Systems   Y      Y      Y
  Strategic Energy Management             Y      Y      Y
  Small Business Energy Solutions         Y      Y      Y

When programs are initially implemented, and for the first set of years, they need both
an impact evaluation and a process evaluation. Since DSM programs are keyed to
markets, a continuing market evaluation is also important. Sometimes market
evaluations can contain process elements, and process evaluations can contain
market elements. When there is a planned reduction of evaluation effort, as shown in
Table 11, Table 12 and Table 13, the reduction is typically the result of the need to
restrict budget and a general sense that programs have become mature. Generally,
a program can be considered mature after five implementation years. Here, for
example, Energy Management Information Systems (EMIS) and Strategic Energy
Management (SEM) are not yet mature programs. The Small Business Energy
Solutions program was substantially changed from the prior small business program,
so it too could be regarded as not mature.
• Impact evaluation: The yearly impact evaluation is essential for new
programs. After the fifth program year, particularly for program types whose
evaluation depends on modeling with unitary values, impacts become well
established. The confidence developed through a series of yearly
evaluations, together with the dependence of the impact evaluations on unitary
values, can suggest a reduced impact evaluation effort. For example, the three
impact evaluations to be reduced for program year 2017 are of this kind, but
the five evaluations to be reduced in 2018 are not.
• Process evaluation: Eventually, program process becomes well established
through years of operation and well documented through a series of process
evaluations. If the program is stable, administration continues in place, there is
little staff turnover, and program delivery vendor and supply relationships
continue in place, then yearly process evaluations may become redundant. It
then makes sense to focus a smaller process evaluation resource on any
newly evolved programs and on any programs in which problems are
experienced.
However, because mature organizations can experience drift as staff turns
over and as a complex organizational environment continues to change,7
process evaluation should recur at some fixed interval. In the intervening
years, program quality control effort should be continued and increased.
Quality control results may call attention to a program that requires a full
process evaluation.
• Market Evaluation: Since DSM programs are strategically directed according
to market tracking, it is difficult to skip a market evaluation. However, it may be
that some programs have markets that are known and stable, so that market
evaluations could be done every other year.
An alternative approach is to run a continuing impact evaluation with a sample each
year that cumulates to the number of sample points capable of producing 90%
confidence and 10% precision (or some other standard) every second or third year.
This would maintain the confidence and precision standard while providing for
continual participation and presence of the evaluator.
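The sample-size arithmetic behind this cumulative approach can be sketched as
follows. This is an illustration only, not the evaluator's method; the coefficient of
variation of 0.5 is a common planning assumption, not a value taken from these
evaluations:

```python
import math

def required_sample_size(cv=0.5, z=1.645, rel_precision=0.10, population=None):
    """Approximate sample size for a confidence/precision standard.

    cv:            assumed coefficient of variation (0.5 is a common
                   planning default, not a value from these evaluations)
    z:             z-score for the confidence level (1.645 ~ 90%)
    rel_precision: target relative precision (0.10 = +/-10%)
    population:    if given, apply the finite population correction
    """
    n0 = (z * cv / rel_precision) ** 2
    if population is not None:
        n0 = n0 / (1 + n0 / population)  # finite population correction
    return math.ceil(n0)

# The classic 90/10 standard with cv = 0.5 requires about 68 sample
# points, which could be cumulated over two or three program years:
print(required_sample_size())  # 68
```

Under this arithmetic, roughly 34 sites per year over two years, or 23 per year
over three, would cumulate to the 90/10 standard while keeping the evaluator
continuously present.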
V. The 2016 Evaluations
Econoler and its associated team (Research Into Action, Equilibrium Engineering
and Corporate Research Associates) evaluated the 2016 programs. The evaluator
conducted and reported on evaluations of six program groups, consisting of
thirteen individual Efficiency Nova Scotia programs for 2016 (Table 14). In
addition, a group of Minimum Energy Performance programs (MEPs) was
evaluated. These Codes and Standards programs are not included within
Efficiency Nova Scotia’s program portfolio.
7 Dekker, Sidney, Drift Into Failure: From Hunting Broken Components to Understanding Complex
Systems. Surrey, England & Burlington, Vermont: Ashgate, 2011.
Table 14: List of 2016 Evaluations.
VI. Savings Verification Review of the 2016 Evaluations
In the Savings Verification study, we focus on program impacts (GWh and MW).
However, the evaluations also include market evaluations and three evaluations
also incorporate process evaluations.
The savings verification review was conducted as follows:
• During the year, we conducted a set of due diligence site visits for each
program. These provide an independent view of each program and of any
obvious problems. The site visits also provide an opportunity to call Efficiency
Nova Scotia's attention to anything that needs to be explained or potentially
fixed. We keep the Nova Scotia Utility and Review Board aware of any
problems encountered.
• We discussed use of the tracking system with Efficiency Nova Scotia. This
provides an independent sense of how the tracking system is working.
• In review of the evaluations, we focused on the “installed” annual energy
savings and demand reductions. [These are the annualized value of
savings and demand reductions from the measures installed, regardless of
what day of the year they were put into service. They do not represent the
“realized” savings for the calendar year of the program, but are the savings
that should be realized when each of the projects operates for a full year.]
These numbers come from the tracking system and from the evaluator’s
measurements, calculation, modeling and adjustments.
• Generally, we did not check the evaluator’s mathematical calculations,
but did carefully check the evaluator’s presentation of method for each
program analysis. We reviewed the interaction, free ridership, spillover
and net-to-gross approaches. In a few cases, we did check the math.
We endorse certain of the evaluator's recommendations.8 Individual findings
and recommendations are also provided. Recommendations are summarized
at the end of the report.

8 Evaluator recommendations not explicitly endorsed are implicitly endorsed. Any exceptions are
treated explicitly by DSM Component in the discussion on recommendations.

VII. General Findings

There are five primary findings:
Finding 2016 General-SV-1: The evaluator’s approaches and analysis for all
evaluations in Program Year 2016 are fully adequate and complete.
Finding 2016 General-SV-2: The evaluator’s presentation format of the Program
Year 2016 evaluation is fully adequate and complete.
Finding 2016 General-SV-3: All the 2016 evaluations are within accepted
industry protocols or a relevant evaluation framework where a protocol does not
exist.
Each program impact evaluation is comprehensive: The structure and format of each
impact evaluation is consistent and useful. Each has an Executive Summary that
accurately reports the contents of the evaluation. Each provides key definitions of
terms and acronyms up-front. Each has a complete table of contents. Each has an
explanatory methodological model diagram, followed by specification and discussion
of methods employed. Citations to other studies throughout the evaluations are
relevant. For each evaluation for which it applies, there is an appendix with a careful
and clear delineation of free ridership calculation methods, interview instruments and
an example of the impact calculation method. These components make the
evaluations transparent and assist independent review of results.
Another finding is based on review of the evaluation plan.
A final finding is based on experience in site visits. When a program shifts from a
restricted and qualified list of approved vendors to the general market, ENS loses
some ability to ensure quality.
General recommendations that apply throughout the evaluation effort are listed
next. These are followed by considerations for future evaluations and
recommendations for the individual evaluations.
Finding 2016 General-SV-4: ENS has stopped doing process evaluation in nine
of twelve program areas (Table 12). For three years in a row these programs will
not have process evaluations. ENS plans to drop five of twelve market
evaluations for the 2018 program year (Table 13). ENS plans to use reduced
effort for three of twelve programs for the 2017 program year and for five of twelve
programs for the 2018 program year (Table 11).
The market evaluation and impact evaluation reductions do not affect program
year 2016 evaluations. The 2016 evaluations are affected by having no process
evaluations.
Finding 2016 General-SV-5: When a program shifts from ENS control to
reliance upon markets, ENS loses some ability to ensure quality.
VIII. General Recommendations
First, the Savings Verification study recommends acceptance of the 2016 evaluation
results for energy savings and for demand-reduction.
Second, we recommend inclusion of the then-current overall evaluation plan in future
evaluations. Reporting the currently operative evaluation plan (shown in this Savings
Verification study as Tables 12–14 on pp. 16–17) permits the reader to begin to focus
on the basis and relative rigor of evaluated results and permits assessment of the
planned flow of continuing evaluation effort.
We recommend, in addition, in the individual program write-ups, that dropped or
reduced evaluation components should be noted along with an explanation of why,
on balance, this is considered appropriate.
Dropping or reducing components of the evaluation means accepting a risk to the
credibility of evaluation results. It is important to be transparent as to where,
when and why an evaluation component is dropped or reduced (both in the current
program year evaluation and as planned). It is understood that ENS has discussed
these changes with the Evaluator and with the Advisory Group. However, in a public
process these decisions should be made clear in the Evaluation report and the
rationale and an assessment of risk should be discussed in each instance.
To clarify the effects of different levels of rigor across evaluations, we recommend
that ENS require the Evaluator to include in the Executive Summary a summary
table that lists all calculated precision and confidence level results across the
evaluations, along with an overall target precision and confidence level. This would
provide a single table for review of comparative rigor across evaluation results.
Results that do not meet target precision and confidence levels should be explained.

Recommendation 2016 General-SV-1: The Savings Verification study
recommends acceptance of the 2016 evaluation results for energy savings and
for demand-reduction for all programs.

Recommendation 2016 General-SV-2: In future years, ENS should require the
Evaluator to include a copy of the operative evaluation plan within the overall
Executive Summary, supplemented by a discussion of why, on balance, selected
dropping of evaluation components or introduction of reduced rigor is considered
appropriate.

In addition, in the individual program write-ups, dropped and reduced evaluation
components should be noted and the rationale for dropping or reducing these
evaluation components should be explained.
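For context, the "relative precision" entries in such a summary table are
conventionally the half-width of the confidence interval divided by the estimate. A
minimal sketch follows; the per-site savings values are hypothetical, not data from
these evaluations, and 90% confidence (z = 1.645) is assumed:

```python
import math

def relative_precision(values, z=1.645):
    """Achieved relative precision of a sample mean (illustrative).

    z = 1.645 corresponds to 90% confidence; relative precision is
    the confidence-interval half-width divided by the mean.
    """
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / (n - 1)  # sample variance
    se = math.sqrt(var / n)                               # standard error
    return z * se / mean

# Hypothetical per-site savings (kWh):
sample = [900, 1100, 1000, 950, 1050, 1000, 980, 1020]
print(round(relative_precision(sample), 3))  # 0.035, i.e. +/-3.5% at 90%
```

A table reporting n, the population size, and this achieved precision for each
evaluated result would make comparative rigor visible at a glance.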
Since dropping evaluation components (process evaluation; market evaluation)
and/or reducing the level of rigor (impact evaluation) can have effects on knowledge
about program implementation and affect results, it is important to put some
boundaries on these changes (dropping and reducing). For example, skipping a
program process evaluation for a year may not be a problem; but skipping more than
two years is likely to fail to detect new problems and skipping more than three years
is not a sound plan. This leads to another general recommendation:
A final general recommendation is based on experience in site visits. When a
program shifts from a restricted and qualified list of approved vendors to the general
market, ENS loses some ability to ensure quality. Accordingly, physical site
inspections for such programs should be increased to strengthen quality control.
Recommendation 2016 General-SV-3: In future years, ENS should require
the Evaluator to provide a summary table of obtained confidence and precision
levels along with population and sample size for all calculations for which
confidence and precision levels were developed. Also, the Evaluator should
state the overall targets for confidence and precision for the evaluations, and
provide a discussion of results in the table. Any variation in the targets and any
results that do not meet the targets should be explained.
Recommendation 2016 General-SV-4: In future years, ENS should specify
rules to bound the dropping of evaluation components and to bound the
reduction of program impact evaluations. There should be a systematic
understanding and articulation of risks to evaluations producing usable and
defensible knowledge. For example, a rule could be that impact evaluation for a
program will be reduced no more frequently than every other year, or that
process evaluations will not skip more than two years in a row. It is quite possible
to do some trade-offs to conserve dollars, with the approval of the Advisory
Group. But these kinds of trade-offs need to be carefully watched and rules set
up to protect the continuing validity and precision of evaluation results.
IX. General Considerations
Basis for recommendations for consideration in this section of the study is provided in
Section XI, Looking toward the Future of Evaluation (p. 62).
The Savings Verification study recommends consideration of extending the time
dimension of the evaluations beyond the attenuated limit of first-year impacts to
facilitate reader comparisons of measures, programs, sub-portfolios and the overall
portfolio of programs.
We recommend consideration of periodic persistence evaluations for specific
measures and/or programs to empirically “true-up” planning assumptions and
evaluation results with field realities.
The Savings Verification study recommends consideration of moving to a market
transformation evaluation for a small number of programs/measures. These are
addressed in the individual program recommendations.
Recommendation 2016 General-SV-5: ENS should increase on-site
inspection for quality control for all programs that shift from ENS control to
reliance on markets.
Consideration 2016 General-SV-1: In addition to reporting first-year impacts,
ENS and the Advisory Group should consider the benefits and cost of also
providing savings estimates based on expected lifetimes to support
comparisons among programs and to improve foresight.
Consideration 2016 General-SV-2: For program types that permit, ENS and
the Advisory Group should consider the potential benefits and costs of
introducing selected persistence evaluations to empirically document lifetimes
and persistence of savings.
Since climate change is affecting Nova Scotia each year, since DSM helps to
mitigate climate problems, and since the intersection of DSM and climate is evolving,
consideration should be given to requiring the Evaluator to report non-utility benefits
for climate mitigation.
We also recommend consideration of initiating discussion with the relevant provincial
agency and climate adaptation advocates of the intersection between DSM and
climate adaptation, particularly in new construction.
Consideration 2016 General-SV-3: ENS and the Advisory Group should
consider the potential benefits and costs involved in employing a market
transformation framework. This would apply to a limited subset of
programs/measures (see specific program recommendations).

Consideration 2016 General-SV-4: ENS and the Advisory Group should
consider the potential benefits and costs involved in requiring the Evaluator to
report non-utility benefits for climate mitigation.

Consideration 2016 General-SV-5: ENS and the Advisory Group should
consider initiating discussion with the province and climate adaptation
advocates of the intersection between DSM and climate adaptation, particularly
in new construction.

X. Individual Program Component Review

Econoler conducted evaluations of six electricity efficiency programs, several with
DSM components or initiatives that had been separate entities in past years. As
noted earlier, for the Savings Verification examination, we focus primarily on impact
evaluation.

1) Residential Efficient Product Rebates Program

The Appliance Retirement program (sometimes referred to as "ARET") is an
appliance pick-up program (it does not replace appliances). The program concept is
permanent removal of inefficient appliances from the grid by deconstructing them
and recycling their components. The recycling process is environmentally sound.
The Appliance Retirement Program includes full-sized and small refrigerators,
full-sized and small freezers, and room air conditioners.
The 2016 evaluation of the Appliance Retirement program was based on a program
documentation review, an interview with the program manager, a participant survey,
one visit to the recycling facility, five on-site visits to participating households during
the appliance retirement process and a unitary savings review. For the unitary
savings review, the Uniform Methods Project Refrigerator Recycling evaluation
protocol was implemented by the evaluator.9 The unitary savings values of each
type of appliance retired through the Appliance Retirement Program were revised
based on the characteristics of the retired appliances documented in the tracking
system and information from the survey of 198 Appliance Retirement Program
participants. The survey gathered information required to compute free ridership and
spillover.10
The evaluation of net impact accounted for free-ridership, secondary market impacts,
induced consumption and internal spillover. The part-use factor was included for
refrigerators and freezers, and hours of operation were used for air conditioners.
Interactive effects were considered and set to zero.
Due to programs through which DSM administrators, manufacturers and government
agencies work together to improve refrigerator standards, the energy consumption of
refrigerators has been dramatically reduced since the early 1970s, while the price of
units has declined and volume (cubic feet within refrigerators) has increased. There
has been very strong improvement in energy efficiency. Similar improvements have
been guided through similar cooperation for freezers and window air conditioners. In
2016, eligibility for this program was changed to allow appliances to be retired at ten
years old rather than fifteen years old. This resulted in a dip in overall program
impacts, due to the continuing decline in electricity required to run a refrigerator, but
an increase in the number of appliances retired.
The evaluator recommends incremental improvements to metering at the recycling
facility. This Savings Verification study supports Econoler's Recommendation 2016
ARet-R1. The Savings Verification study has no additional recommendations for this
evaluation area.

9 Uniform Methods Project Refrigerator Recycling Evaluation Protocol. See:
(http://www1.eere.energy.gov/wip/pdfs/53827-7.pdf). Prepared by Doug Bruchs and Josh Keeling,
The Cadmus Group, April 2013.

10 In addition, the Appliance Retirement Program conducted some metering, with a focus on small
appliances, at its workshop. However, only a small number of units were metered; too few to make
the data useful for evaluation calculations.
2) HomeWarming Program
Appliance savings impacts from the HomeWarming (low-income) program are
included in the impacts of the Appliance Retirement Program. The HomeWarming
program picks up and recycles (and replaces with efficient appliances) refrigerators,
freezers, room air conditioners and dehumidifiers. It differs from the Appliance
Retirement program in that it also provides efficient replacement appliances.
HomeWarming evaluation calculations generally follow those for the Appliance
Retirement program but with some differences. The differences are due to different
rules for qualifying appliances in the low-income program and to different
assumptions for low-income households.
2016 ARet-R1. Continue to perform metering activity and keep improving its quality
and effectiveness. This year, metering activity was focused on the small appliances
retired since they were made eligible for the program component for the first time in
2016. However, the number of units metered was too small and the volume of data
collected was not enough to support an impact evaluation. Another change made
this year was the reintroduction of the "10 years old or older" eligibility criterion, after
accepting only appliances of more than 15 years of age in 2015. Consequently, two
types of appliances should be particularly targeted for metering in 2017, namely
small appliances and full-size appliances with a more recent manufacture year class
(in order to include appliances between 10 and 15 years of age). To help ensure that
sufficient data is collected for the 2017 evaluation and improve metering data
precision level, the Evaluator recommends planning the metering activity as early as
possible in 2017. The Evaluator's visit to the recycling facility and analysis of the raw
metering data revealed that the metering process can be improved by:

• Using the raw data from the current transformer instead of the single final value
from the plug-in meter to calculate energy consumption values. This will allow
validating the proper functioning of the metering equipment since consistency in
amperage can be verified over time.

• Logging refrigerators' and freezers' internal temperatures throughout the metering
process instead of recording the internal temperatures only at the beginning and
the end of the metering period.

• Establishing a calibration procedure for the data-logger and the plug-in meter, to
be followed before the metering starts.
In the combined Appliance Retirement and HomeWarming evaluation, the methods
and analytic approach are fully within the scope of an independent evaluator. In
addition, the evaluation is excellent because it follows a “better practice” universal
evaluation protocol for this program type and because it incorporates some metering
to provide grounding to calculated values. For this evaluation, the evaluator carried
out all the necessary detailed steps and included all factors that might reduce
program energy savings. Judgments and weightings were carefully carried out. This
is an excellent evaluation.
3) Instant Savings Evaluation
Instant Savings is an “upstream discount” program experienced by residential
customers through discounts on energy efficient products when making purchases in
stores. It is both a resource acquisition and a market transformation program.
Items for which a rebate was provided in 2016 are:
• ENERGY STAR® Canada certified light emitting diode (LED) lamps, fixtures,
and recessed downlight fixtures
• Dimmer switches
• Indoor and outdoor motion sensors
• Power bars with integrated timers
• Load-sensing power bars (smart strips)
• Heavy-duty outdoor timers
• Programmable thermostats for electric baseboard heaters
• Outdoor clothesline kits and outdoor clothes dryers
• Refrigerators (qualifying models use 20 kWh per cubic foot of adjusted volume
or less)
• Clothes Washers (qualifying models are on the ENERGY STAR Most Efficient
2016 list)
Appliance rebates were available all year; the other items were rebated during spring
and fall campaigns. Eleven chains plus thirty-four independent stores participated.
Savings were unusually high, due largely to an unanticipated sales campaign for
rebated LEDs at a competitive price by one retailer. There was also a large increase
in sales of efficient rebated refrigerators.

The evaluation makes use of delivery agent and store staff interviews, in-store visits,
intercept surveys (n=246) and careful examination of unitary savings. This
evaluation involves an individual detailed calculation for each type of Instant Savings
product. Calculation of unitary savings required borrowed data in several areas and
appears to use relevant criteria (recent Technical Reference Manuals and reports
from other jurisdictions) for selection of studies. The Evaluator conducted a thorough
literature review and performed engineering calculations to revise the unitary savings
values used in the tracking sheet.

Econoler provides two recommendations for this program. The Savings Verification
study supports both Econoler recommendations.

Also, we note that both recommendations cross over from the simpler resource
acquisition paradigm to the more sophisticated territory of the market transformation
paradigm. For this reason, we recommend that the 2017 evaluation consider use of
a market transformation evaluation rather than a market effects evaluation (2016
Instant-SV-R1).
2016 Instant-R1. Continue monitoring key market indicators to adapt program
component offerings when needed. Increasing adoption of LEDs by consumers and
changes in retailer offerings were noticed during the 2016 evaluation. The high
number of LEDs sold during the 2016 year (either within or outside the campaign
periods) and the declining retail price of LEDs (see Subsection 6.2) indicate market
barriers have been reduced, especially for A-type LEDs. The Evaluator recommends
continuing to monitor key market indicators such as the price of LED lamps and
number sold, the number of participating retailers, customer awareness and
experience with LED lamps, as well as their penetration rate in Nova Scotia
households. These data should be deeply analyzed in the next evaluation to
determine if the significant changes observed in 2016 persist over time and to be
able to recognize the point where program interventions, in their current form, will no
longer be needed for LED lamps, i.e., the point where market transformation is
achieved. Based on the state of the market, different exit strategies that can lead to
continuous energy savings even after a program or incentive has ended could be
investigated.

2016 Instant-R2. Closely monitor the evolution of the refrigerator and clothes
washer markets. Effective 28 June 2017, a new Canadian energy efficiency
regulation identical to the current American regulation comes into force. This
regulation will impact the energy efficiency level of refrigerators and clothes washers.
It might also significantly affect the baseline energy consumption and thus the
unitary savings value of appliances sold through Instant Savings. It could also
impact product offerings, for instance by increasing the number of models that meet
ENERGY STAR guidelines and Consortium for Energy Efficiency (CEE) Tier III
standards. The Evaluator recommends that this situation be closely monitored and
analyzed in 2017 to confirm whether it is still relevant and cost-effective to offer
incentives for refrigerators and clothes washers and, if so, to select the right energy
performance criteria to ensure a certain level of savings and impact in the market.
4) Home Energy Assessment
Home Energy Assessment (HEA) provides financial incentives in the form of rebates
or zero-interest financing to homeowners to reduce consumption of energy. For
several years, HEA focused on shell insulation measures and in 2016 expanded focus
to include space heating equipment (for example, wood/pellet stoves and heat
pumps) and water heating equipment (solar domestic water heaters and heat pump
water heaters). In 2015 provincial funding for HEA was discontinued. This limits
service to include only electrically-heated houses. The electricity savings per house
increased in 2016 since only homes with electric heat were treated, changing the
base for the per home calculation of program impacts. (In previous years, homes with
secondary electric heat were included in the program).
The evaluation consists of a validation of the data in the program tracking system
and inclusion of the Net to Gross Ratio based on survey results (n=104) for free
ridership and spillover. The program design incorporates a “test-in” pre-audit of the
home (referred to as the D audit) plus a "test-out" post-audit (referred to as the E
audit). These audits are essential to the program and are also used in the program
evaluation.
For ten selected sites, both an on-site visit and HOT 2000 simulation modeling were
conducted. For the ten sites, file information from the program was checked against
directly observed data from the site visits. The evaluator found that for six of the ten
sites examined, the recorded tracking data contained errors that caused the
accuracy of energy consumption calculations to be off by more than three percent
(failing the NRCan HOT 2000 guideline for accuracy). However, for the ten projects
considered together, the group error was one percent. For residential programs this
is a typical pattern: house-level results vary, but for a group of houses errors tend to
balance out, providing a stable program estimate that nevertheless meets the
precision target.

In carrying out the calculations for the D and E assessments, an overestimation ratio
of 16 percent was used to ratio down savings. The overestimation ratio was
established by the evaluator from a billing data analysis (correcting HOT 2000 model
estimates using utility energy usage data) in the 2015 HEA evaluation. Since the
overestimation ratio was established in a recent year, it is reasonable to use it for
2016.

Recommendation 2016 Instant-SV-R1: For A-type LEDs, in the 2017 Instant
Savings Program evaluation, ENS should consider the costs and benefits of using
a market transformation paradigm rather than a resource acquisition approach
(including study of market effects). This would include development of the S-curve
for A-type LEDs and a full discussion of market transformation program
implications. The evaluation would also discuss monitoring A-type LEDs in future
years to confirm the stability of a transformed market, and the attribution of
subsequent market impacts.
Free ridership, spillover and the net to gross ratio were computed. Savings
impacts from participants who started the program (received the D Audit)
but did not finish with the E Audit were calculated and included in spillover.
This part of the calculation was based on telephone survey data from the
subgroup who started, but did not finish the program. This is a reasonable
inclusion of additional data.
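As background, net savings are conventionally derived from gross savings via a
net-to-gross ratio built from free ridership and spillover. The sketch below uses the
standard industry form, not necessarily the evaluator's exact equation, and the
numbers are hypothetical, not values from the HEA evaluation:

```python
def net_savings(gross_kwh, free_ridership, spillover):
    """Standard net-to-gross adjustment (illustrative only).

    free_ridership: fraction of gross savings that would have occurred
                    without the program (hypothetical, e.g. 0.20)
    spillover:      additional program-induced savings as a fraction of
                    gross (hypothetical, e.g. 0.05); here this could
                    include savings from participants who received the
                    D audit but never completed the E audit
    """
    ntg = 1.0 - free_ridership + spillover  # net-to-gross ratio
    return gross_kwh * ntg

# Hypothetical figures: 1,000 kWh gross, 20% free ridership, 5% spillover
print(net_savings(1000.0, 0.20, 0.05))  # 850.0 kWh net
```

Counting D-audit-only participants as spillover, as the evaluator did, raises the
spillover term and thus the net-to-gross ratio.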
The evaluator has three recommendations for HEA, which the Savings Verification
study also supports (2016 HEA-R1, 2016 HEA-R2 and 2016 HEA-R3):
2016 HEA-R1. Assign staff to answer EA questions about HEA. To clear some of the confusion created by the changes HEA has undergone since April 2015, EAs asked for a single ENS contact to answer their questions and inquiries, especially about eligibility requirements. EAs also mentioned the need for additional training for themselves. Facilitating transmission of information to EAs could also have a positive impact on participants who mentioned unclear HEA requirements as a source of dissatisfaction with HEA in 2016.
2016 HEA-R2. Use the unitary savings established as part of the 2016 HEA evaluation to determine the savings generated by the DHW measures. During this evaluation, the Evaluator assessed how best to determine the savings associated with DHW measures recently introduced in HEA, and established unitary savings for each type of DHW system. Unitary savings values were favoured over the simulation results to provide more accuracy and consistency with the savings claimed among the different residential program components. Only a small number of DHW measures (two) were installed by HEA participants in 2016, but since this number is expected to increase, the Evaluator recommends integrating the unitary savings values as the method to claim DHW measure savings in the HEA tracking system.
The Savings Verification study has no additional recommendations for
evaluation of this program.
The method, decisions and calculations performed by the evaluator are appropriate
to this program type and within the scope of an experienced independent evaluator.
They are intelligent and well-developed based on cumulative experience over
several years.
5) Green Heat
The Green Heat Program focuses on both full replacement and supplementation
of heating systems by installation of equipment that uses fuel derived from
renewable resources. Participants can choose between a rebate and a zero-interest
financing option. Incentives are available for central heating systems, as well as to
supplement electrical heat with an area heater. The following categories of heating
system are rebated:
• Biomass: high-efficiency wood and pellet stoves, and central heating systems
• Heat pumps: high-efficiency space heating (e.g. high efficiency mini-split heat pumps) and domestic hot water heating
• Solar thermal: solar domestic hot water and solar space heating systems
Several equipment options are available. However, the preponderance of measures
(85%) installed through Green Heat in 2016 were mini-split heat pumps.
2016 HEA-R3. Maintain follow-ups with EAs and continue to conduct project reviews to assess the progression of simulation file quality. As part of the 10 HEA project reviews (on-site visits and HOT2000 simulation reviews), the Evaluator noted that EAs generally performed well in recording household energy component data, though there were some differences in the quality of the reviewed simulation files. The Evaluator noted opportunities to increase the accuracy of HOT2000 simulations and recommends sharing with EAs those aspects that ENS would like to improve. ENS already conducts follow-ups with EAs and the Evaluator recommends continuing to do so. Project reviews should also be conducted on a regular basis to continue monitoring the quality of simulations and monitor progress. Finally, if the HOT2000 software should ever be modified, or if considerable improvement is made to the quality of simulation files, the overestimation ratio should be reviewed with a new billing analysis to increase accuracy of the savings calculations.
Verification Review of Program Year 2016 Evaluation Results
Participation increased in comparison with past years, primarily due to the interest in
high efficiency mini-split heat pumps. Average savings per measure was lower than
in past years, primarily because (though cost-effective) mini-split heat pumps have
lower savings values than some alternative equipment.
The evaluation of Green Heat is based on document review, interviews with the
program manager, interviews with retailers (n=10), a telephone survey of participants
(n=90), on-site visits (n=25), and a billing analysis of mini-split installations from 2015
(n=36). There were also three simulation analyses and a review of the data in the
program tracking system. Free ridership was developed from the participant survey
by category of measure and the net-to-gross ratio was constructed using these
values.
The evaluation showed a slightly lower net-to-gross ratio than program calculations
and used a lower savings value for the mini-split heat pumps. The billing analysis of mini-split heat pumps installed during 2015 showed a lower pre-program electrical space
heating consumption than expected, suggesting the possibility of other non-electrical
heat sources in this group of homes. Free-ridership information was developed from
the participant survey, by category of measure.
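The net-to-gross construction described above (free-ridership by measure category, combined with spillover into an NTG ratio) can be sketched as follows. All category names, savings values, and rates here are invented placeholders, not values from the evaluation, and the NTG = 1 - FR + SO convention is one common formulation rather than necessarily the evaluator's exact method.

```python
# Illustrative net-to-gross (NTG) construction by measure category.
# All numbers are made-up placeholders for demonstration; the
# evaluation's actual survey-derived values differ.

measures = {
    # category: (gross_kwh_savings, free_ridership, spillover)
    "mini_split_hp": (4990.0, 0.30, 0.02),
    "pellet_stove":  (7200.0, 0.15, 0.00),
    "solar_dhw":     (2500.0, 0.10, 0.05),
}

def ntg(free_ridership: float, spillover: float) -> float:
    """One common NTG convention: 1 - FR + SO."""
    return 1.0 - free_ridership + spillover

# Weight each category's NTG by its gross savings to get a
# portfolio-level ratio.
total_gross = sum(g for g, _, _ in measures.values())
total_net = sum(g * ntg(fr, so) for g, fr, so in measures.values())

print(f"portfolio NTG ratio: {total_net / total_gross:.3f}")
```

Weighting by gross savings, rather than averaging category NTGs directly, keeps a high-volume measure such as mini-split heat pumps from being under-represented in the portfolio ratio.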
Econoler’s evaluation provides five recommendations. The Savings Verification study
supports all five. Recommendations 2016 GH-R1 and 2016 GH-R2 are market
analysis recommendations to improve selection of more efficient technology and to
improve promotion of higher levels of efficiency.
2016 GH-R1. Monitor market progress to adapt the program component offer to the evolving market. According to the retailers of biomass measures and heat pumps interviewed as part of the 2016 evaluation, the market has changed considerably in the last few years. Changes were noticed in consumer awareness and interest, variety of equipment offered, levels of product quality, and price. Moreover, some of the rebated products were mentioned as very popular among the portion of consumers who are not very price sensitive. The Evaluator recommends conducting general population surveys or market actor consultations to better understand how the market is evolving and to what extent it has transformed. Depending on the level of market transformation achieved and the barriers still found in the market, the program component offer (eligible products, rebate amounts) and delivery should be adapted. If necessary, eligibility criteria should be revised to ensure that products offered through Green Heat are the best in terms of energy efficiency and not currently widely purchased on the market. These data collection activities could also be used to start monitoring market indicators of new products as soon as they are introduced in the market, which would enable both better understanding of program influence at the market level and further study of the market transformation.
Recommendations 2016 GH-R3, 2016 GH-R4 and 2016 GH-R5 will improve
measurement for Green Heat.
2016 GH-R5. Continue researching more accurate sources to estimate the runtime
hours of central heating systems. The Evaluator found new sources from available
literature that estimated equivalent full load hours for central heat pumps based on
energy modelling. This represents an improvement over the previous estimate which
was based on a standard value. However, the Evaluator considers that accuracy could
be further improved by accounting for actual conditions under which heating systems
operate in ENS participants’ homes. For instance, the billing analysis established
average energy savings for MSHPs at 4,990 kWh, despite an average installed capacity
of 21,509 Btu/h. This suggests that MSHPs generally do not run for as many annual
hours as originally estimated, notably because of the presence of other heating systems.
Conducting a similar billing analysis or a metering study for central heat pumps would
therefore allow factoring in parameters that are representative of equipment installed
under Green Heat, such as control strategies and interactions with other heating
systems.
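The reasoning in 2016 GH-R5, that billed savings are low relative to installed capacity, which implies low annual run hours, can be checked with a back-of-envelope calculation under the stated assumption that the heat pump displaces electric resistance heat. The seasonal COP below is an assumed illustrative value, not a figure from the evaluation.

```python
# Back-of-envelope estimate of implied MSHP equivalent full-load hours,
# assuming the unit displaces electric resistance heating. The seasonal
# COP is an ASSUMED illustrative value, not from the evaluation.

BTU_PER_KWH = 3412.0

avg_savings_kwh = 4990.0       # from the 2015 billing analysis
capacity_btuh = 21509.0        # average installed capacity
assumed_seasonal_cop = 2.5     # assumption for illustration only

thermal_kw = capacity_btuh / BTU_PER_KWH
# Electric savings per full-load hour versus resistance heat:
# the heat pump delivers thermal_kw of heat using thermal_kw / COP
# of electricity, so each full-load hour saves the difference.
savings_per_hour_kwh = thermal_kw * (1.0 - 1.0 / assumed_seasonal_cop)

implied_eflh = avg_savings_kwh / savings_per_hour_kwh
print(f"implied equivalent full-load hours: {implied_eflh:.0f}")
```

Under these assumptions the implied run time is on the order of 1,300 equivalent full-load hours per year, well short of a full Nova Scotia heating season, which is consistent with the Evaluator's point that other heating systems carry part of the load.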
2016 GH-R2. Increase program component advertising to households and
collaborate with retailers. Evaluation results showed fairly high free-ridership levels for
products rebated through Green Heat, especially for heat pumps and MSHPs. The
Evaluator recommends focusing marketing efforts on (1) better communicating with
households, and (2) collaborating with retailers. Advertising the program component
more effectively to households could contribute to lowering the free-ridership level by
reaching out to homeowners who have not yet decided to replace or supplement their
equipment with heat pump, solar or biomass units. Indeed, many surveyed participants
conceded that they had already made the decision to install a high-efficiency system
before learning about Green Heat. For homeowners who have already decided to
change their heating system, retailers could encourage them to choose the most efficient
product by offering the necessary information to customers. However, the lack of both point-of-sale material and retailer training results in retailers poorly understanding Green Heat.
The Evaluator suggests increasing program component knowledge among retailers by
providing them with an impetus to more actively promote higher efficiency models, and
offering point-of-sale promotional material. ENS could request retailer inputs about which
equipment to promote to achieve energy savings that would not occur otherwise.
To these evaluator recommendations, we add one Savings Verification
recommendation to consider restructuring the 2017 evaluation as a market
transformation evaluation rather than as a resource acquisition evaluation for the mini-
split heat pump measure. At this stage in the maturity of the program and the
evaluation, shifting to a market transformation paradigm is appropriate.
The methods and calculations performed by the evaluator are appropriate to this
program type and within the scope of an experienced independent evaluator.
2016 GH-R4. Perform the MSHP billing analysis again in 2017. The billing analysis
conducted as part of the 2016 evaluation resulted in a major change made to the
average savings value for MSHPs. Although this analysis yielded conclusive results, the
Evaluator believes that it should be conducted again in 2017 for two reasons: (1) to
perform the analysis with a larger sample of participants who will provide sufficient billing
data; and (2) to take into consideration any major changes that may occur in 2017 as the
Green Heat management team makes adjustments to this relatively new measure.
Recommendation 2016 Green Heat-SV-R1: In the 2017 evaluation for the Green Heat
high efficiency mini-split heat pump measure, ENS should consider the benefits and
costs of structuring the evaluation as a market transformation evaluation. Continuing
to treat it as a resource acquisition evaluation with an expanded market evaluation
stretches the boundaries of that evaluation approach.
2016 GH-R3. Consider real baseline conditions in the savings calculation. The
review of the tracking sheet and the results of site visits revealed that many participants
already owned either an efficient heating system or a secondary heating system that
meets a large portion of their heating needs, thereby reducing the energy savings
potential of their home. The billing analysis rendered evidence to support this impact, as
the pre-installation electrical heating load was estimated at only 11,413 kWh (compared
to a previous estimate of 14,600 kWh for a completely electrically-heated home). It is
therefore recommended that the efficiency of existing electrical systems and the
presence of secondary systems be considered when estimating savings, rather than
using unitary savings values that are based on a heating load entirely supplied by an
electrical resistance heating system.
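The baseline adjustment implied by 2016 GH-R3 can be illustrated with simple arithmetic. The two heating-load figures come from the paragraph above; the unitary savings value is an invented placeholder, since the actual deemed values are not reproduced here.

```python
# Illustrative scaling of a unitary savings value to reflect the actual
# pre-installation electric heating load found in the billing analysis.
# The unitary savings figure is a made-up placeholder.

assumed_full_electric_load_kwh = 14600.0  # prior all-electric-home estimate
observed_pre_install_load_kwh = 11413.0   # from the billing analysis

# Ratio of observed electric heating load to the all-electric assumption.
baseline_factor = observed_pre_install_load_kwh / assumed_full_electric_load_kwh

unitary_savings_kwh = 6000.0              # placeholder deemed value
adjusted_savings_kwh = unitary_savings_kwh * baseline_factor

print(f"baseline factor: {baseline_factor:.3f}")
print(f"adjusted savings: {adjusted_savings_kwh:.0f} kWh")
```

A simple proportional scaling like this is only one way to apply the recommendation; a per-home adjustment based on audit data on existing and secondary systems would be more faithful to the Evaluator's intent.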
6) Residential Direct Install
The Residential Direct Install Program (RDI) provides direct installation of standard
energy-efficient lighting and domestic hot water-heating products free of charge to
both homeowners and renters (renters in buildings with 20 or fewer apartments). The
products for 2016 were:
• Light Emitting Diode (LED) lamps, including LED reflector lamps (to replace halogen lamps) and LED chandeliers
• LED nightlights
• Faucet aerators (for electric domestic hot water systems)
• Low-flow showerheads (for electric domestic hot water systems)
• Pipe insulation (for electric domestic hot water systems)
• Hot water tank wraps (for electric domestic hot water systems)
• Smart power controllers for audiovisual equipment

Compact fluorescent lamp treatments were discontinued in April 2015, and all water
consumption and insulation measures were directed to electrically heated DHW
homes only. For low-income participants, the program also receives funding from the
Province of Nova Scotia for upgrades that reduce the use of fuels other than
electricity; this part of the program is not covered by the evaluation.
The evaluator conducted interviews with the Program Manager. There were 100 site
visits, and a telephone survey of participants was also conducted (n=250). Tracking
system data were reviewed. Reviews of spillover, interactive effects, and unitary
savings were conducted. An adjustment was made for displaced wattage.
Hours of lighting operation were derived from selected secondary sources (a new
lighting study will provide Nova Scotia data for the 2017 evaluation). Results for
smart power controllers were also based on borrowed data. Calculations were
constructed by measure category. Free-ridership, internal spillover and installation
rates were developed from the participant survey.
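The displaced-wattage adjustment described above can be sketched per lamp as follows. The wattages, hours, and rates are illustrative placeholders rather than the borrowed-data values used in the evaluation.

```python
# Sketch of a per-lamp savings calculation using displaced wattage.
# All parameter values below are illustrative placeholders, not the
# secondary-source values used in the 2016 evaluation.

def lamp_savings_kwh(baseline_w: float, efficient_w: float,
                     hours_per_year: float, installation_rate: float,
                     ntg_ratio: float) -> float:
    """Net annual kWh saved per lamp distributed."""
    displaced_kw = (baseline_w - efficient_w) / 1000.0
    return displaced_kw * hours_per_year * installation_rate * ntg_ratio

# Example: 60 W halogen reflector replaced by a 9 W LED reflector lamp.
net = lamp_savings_kwh(baseline_w=60, efficient_w=9, hours_per_year=900,
                       installation_rate=0.85, ntg_ratio=0.95)
print(f"net savings: {net:.2f} kWh/yr per lamp")
```

Note that the installation rate (from on-site verification or the survey) and the NTG ratio enter multiplicatively, so an error in either scales net savings directly.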
The evaluator provides two recommendations, both for the smart power controllers.
The Savings Verification study supports both recommendations, and has no
additional recommendation for Residential Direct Install.
7) Rental Properties and Condos Service
The Rental Properties and Condos Service program provides direct free-of-charge installation of energy-efficient products in both common space areas and rental units to landlords, renters or condominium owners. Installations in common areas are allocated to the Business Energy Solutions (BES) program, so this evaluation is confined to work completed in individual apartments or condominiums. Measures included for residential units for 2016 are:
• Light Emitting Diode (LED) lamps
• LED nightlights
• Hot water tank wraps
• Pipe insulation
• Faucet aerators
• Low-flow showerheads/shower wands

Only upgrades that reduce electricity use are reported in the evaluation. Upgrades
2016 RDI-R1. Limit the number of smart power controllers installed to two per household. The tracking report indicated that 589 RDI participants each received three or more smart power controllers. The Evaluator believes that the third and fourth controllers installed in one house are unlikely to generate the same savings per unit as a single controller installed in a house, since their respective audiovisual systems may be used only occasionally. Also, of the eight participants visited who had received more than one controller, only one still had all the controllers installed, resulting in an average installation rate of 30 percent for this group of participants. Although these results were obtained from quite a small sample, the Evaluator still believes that having more than two smart power controllers installed might not be an effective way to maximize RDI's impacts.
2016 RDI-R2. Conduct more on-site visits with participants who received smart power controllers. Based on the information collected from the 44 site visits conducted with participants who had smart power controllers installed in 2016, a fairly low installation rate was observed. These 44 visits did not provide enough data to establish an installation rate within an acceptable margin of error. Since a site visit is considered a much more reliable means of validating the installation of smart power controllers than a telephone survey, it is recommended that additional visits be conducted in early 2017 to determine with confidence whether installation rates have decreased in comparison with previous years.
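The sample-size concern in 2016 RDI-R2 can be illustrated with a standard margin-of-error calculation for a proportion. The confidence level, target margin, and assumed installation rate below are illustrative choices, not the Evaluator's criteria.

```python
import math

# Rough sample-size check for estimating an installation rate (a
# proportion) within a target margin of error. The confidence level,
# margin, and assumed rate are illustrative choices only.

def required_sample(p_hat: float, margin: float, z: float = 1.645) -> int:
    """Simple-random-sample size for a proportion at the given z-score
    (z = 1.645 corresponds to roughly 90% confidence), ignoring any
    finite population correction."""
    return math.ceil(z ** 2 * p_hat * (1 - p_hat) / margin ** 2)

# At an assumed 50% installation rate (the most conservative case)
# and a +/-10% margin of error:
n = required_sample(p_hat=0.5, margin=0.10)
print(n)  # exceeds the 44 visits conducted
```

Under these assumptions roughly 68 completed site visits would be needed, which is consistent with the Evaluator's conclusion that the 44 visits were not sufficient for an acceptable margin of error.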
that reduce use of other fuels are paid for by the Province of Nova Scotia. These are not included in the evaluation.

Review of program documents, review of the information in the tracking database, on-site verification (n=45) and a telephone survey (n=79) were conducted by the evaluation team. Program savings estimates were reviewed and the evaluator also conducted a literature review. The evaluation included a unitary savings review for each type of product, based on the literature review and engineering calculations. For lighting, displaced wattage was calculated based on tracking data and hours of operation were based on borrowed data. On-site verification was used to establish installation rates and the quality of products installed. Free-ridership, spillover and the net-to-gross ratio were computed using survey information. A net-to-gross ratio of 1.00 was assumed for low-income units.

The evaluator provides one recommendation for the Rental Properties and Condos
Service Program. The Savings Verification study supports this recommendation
(RP&C-R1). We have no additional recommendations for this evaluation.
The methods and calculations performed by the evaluator are appropriate to this
program type and within the scope of an experienced independent evaluator.
8) New Home Construction Program
The New Home Construction program is an electric program designed to encourage
homeowners and builders to exceed building code requirements for energy
efficiency in new homes. The program includes a review of house plans using the
HOT2000 software. Simulation results and plan review are provided to the
homeowner and the builder to project energy consumption of the home and to
develop recommendations for improvement. The program can certify ENERGY
STAR® Canada, R2000 or improvement beyond the national building code's
minimum requirements. There is a final inspection of the completed home to verify
the efficiency level of the home and to test air-tightness using a blower door test.

RP&C-R1. Start measuring spillover in 2017. Spillover has never been measured for RP&C because it has been assumed that RP&C replaces all the inefficient products that could be upgraded by tenants. However, as the lighting market evolves, the Evaluator has observed some spillover in RDI, another direct-install program component. For instance, compact fluorescent lamps are being replaced by LED lamps, and mini-split heat pumps are being installed. Similarly, although most RP&C participants are renters and are presumably less likely to invest in upgrading their homes, they are nonetheless still likely to start installing some of these measures without receiving any incentive encouraging them to do so. Therefore, the Evaluator recommends starting to measure spillover for RP&C in 2017.
The current incentives are $1,000 for achieving ENERGY STAR (or an EnerGuide
rating of 85 to 87) or an incentive of $2,000 for homes that achieve R2000 (or an
EnerGuide rating of 88 or higher).
Evaluation was based on a program documentation review, an in-depth interview
with the program manager, a telephone survey of participants (n=19), a telephone
survey of participating and non-participating builders (n=46), and on-site visits with
verification HOT2000 simulations (n=10). Simulation modeling used the to-code
energy consumption value incorporated in the tracking system as a baseline.11 The
to-code value was used for almost all houses. Only electric savings were claimed.
Electric savings were not claimed for homes with a back-up electric heating system
and so were slightly understated. For the evaluation, an overestimation ratio of
sixteen percent was used.12 The evaluation calculated free-ridership and spillover
and the net-to-gross ratio, primarily based on the two surveys. Participant builders
had more weight in construction decisions than the participant homeowners. This is
a reasonable choice because builders develop some houses independently and work
with homeowners on other homes.
The verification HOT2000 simulations found several errors in coding and missing
pieces of required documentation. For five of the ten houses subject to verification,
the errors resulted in a difference of more than three percent, an accuracy
standard used by Natural Resources Canada. The evaluator provided two
recommendations for this program addressing these concerns. The Savings
Verification study supports these two evaluator recommendations (2016 NHC-R1 and
2016 NHC-R2). We have one additional recommendation for this evaluation.
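The three-percent accuracy screen described above might be implemented as a simple tolerance check on paired simulation results. The consumption values below are invented placeholders, not results from the ten reviewed houses.

```python
# Sketch of a three-percent accuracy screen for verification simulations,
# comparing each original HOT2000 result against its verified re-run.
# The paired consumption values are invented placeholders.

def exceeds_tolerance(original_kwh: float, verified_kwh: float,
                      tol: float = 0.03) -> bool:
    """True when the verified simulation differs from the original by
    more than the tolerance (3% here, after the NRCan-style standard)."""
    return abs(verified_kwh - original_kwh) / original_kwh > tol

# (original, verified) annual consumption pairs, kWh -- placeholders:
pairs = [(21500, 21900), (18200, 19400), (25000, 24800)]
flags = [exceeds_tolerance(orig, ver) for orig, ver in pairs]
print(flags)
```

A house is flagged whenever the relative difference exceeds the tolerance in either direction; in the report's sample, five of ten houses would have been flagged by such a check.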
11 For four of the 565 participant houses, the to-code option was not in the tracking system yet when
work was begun. For these homes, the old system (an EnerGuide baseline of 78) was used.
12 The overestimation ratio was derived in the Econoler 2015 billing analysis of HEA participants
between 2013 and 2015.
The methods and calculations performed by the evaluator are appropriate to this
program type and within the scope of an experienced independent evaluator.
9) Business Energy Rebates
Business Energy Rebates is a mail-in and instant rebate program that provides
prescriptive rebates to business, non-profit and institutional (BNI) organizations.
Incentives are distributed through a combination of mail-in and instant point-of-sale
rebates. The prescriptive measures covered by Business Energy Rebates are
2016 NHC-R1. Continue conducting project reviews to assess the evolution of the quality of simulation files. As part of the 10 NHC project reviews (on-site visits and HOT2000 simulation reviews), the Evaluator noted that EAs generally performed well in recording household energy component data, though there were some differences in quality of the simulation files reviewed. The Evaluator is aware that training was provided to EAs in 2016 and therefore recommends conducting project reviews on a regular basis to keep monitoring simulation quality and monitor progress over time. The Evaluator noted some recurrent mistakes in the HOT2000 simulations (see Subsection 5.1) and will follow up on these aspects during the 2017 evaluation. In the meantime, the Evaluator recommends sharing with EAs those aspects of the project review process that ENS would like to improve.
2016 NHC-R2. Ensure that all required documentation is included in participant files. Though EAs generally performed well in recording household energy component data, the Evaluator could not validate some of the information entered in the simulation files because some required documentation was missing for eight of the 10 projects reviewed, such as specification sheets and drawings. According to NRCan, all documentation required to perform a quality assessment should be provided. Specification sheets should at least be provided for heat pumps and heat exchangers. All participant files should include sufficient details about the building envelope either using a detailed final design evaluation report, or the necessary section views and component drawings to indicate all material layers, insulation levels and thickness contained in wall and ceiling components.
Recommendation 2016 New Home Construction-SV-R1: ENS should consider meeting with the appropriate provincial agency and environmental advocates to discuss the evolving intersection of DSM with climate adaptation, particularly for new construction and major renovation projects such as Net Zero and Passivhaus projects.
grouped into several categories: agricultural, commercial lighting, commercial
refrigeration, commercial laundry, commercial kitchen, compressed air, heating, IT &
datacenters, variable frequency drives, commercial water heating, solar thermal and
pumping.
The evaluator conducted an interview with the program manager. For mail-in
participants, the evaluator conducted a telephone survey (n=70), on-site visits and
project reviews (n=81). The on-site visits and reviews validated that installations
matched the tracking sheet information and were properly installed. Also, on-site information was
gathered on hours of operation, conditions of operation, and (where possible) old and
new equipment characteristics. For instant rebates, the evaluator conducted
interviews (and then a second-round Delphi panel on free-ridership) with instant
rebate distributors (n=9) as well as a unitary savings review for each measure-type.
Free-ridership and internal spillover results for mail-in rebates were calculated using
results from the mail-in participant surveys and on-sites. Free-ridership for instant
rebates was calculated from responses to the distributor interviews. The evaluator
developed a unitary savings calculation, determination of free-ridership, internal
spillover and net-to-gross ratio. The evaluator applied these to verified installation
rates and system tracking data. Where information was not available, the evaluator
developed engineering calculations.
Results indicate that while the rebates for LED products remain useful, their effect is
declining as the lighting market is being transformed. Lighting measures represent
nearly all instant rebate savings and the largest measure category for mail-in rebates.
The evaluator expects the transformation of markets for LEDs to have a major impact
on the Business Energy Rebates program.
Four recommendations were provided by the evaluator:
2016-BER-R1. Continue improving the BER Mail-in internal review procedure for savings calculations. In 2016, the Evaluator applied large adjustment ratios to five measure categories in BER Mail-in. The lighting measure category required the most adjustments for a number of reasons: (1) hours of use (HOUs) were not representative of the facility schedule; (2) ballast factors were used inconsistently; and (3) baseline wattages were sometimes inadequate. The Evaluator encourages ENS to implement measures to specifically address these issues, such as modifying the application form to gather more detailed information on HOUs, adding a verification step specific to the inclusion of the ballast consumption for baseline wattages and using the efficient fixture specification sheet to establish the baseline for new construction applications.
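The three adjustment drivers named in 2016-BER-R1 (hours of use, ballast factor, and baseline wattage) all enter a fixture-level savings calculation of roughly the following form. Every number below is an invented placeholder, not a value from the evaluation.

```python
# Illustrative fixture-level lighting savings calculation showing where
# the issues in 2016-BER-R1 enter: hours of use (HOU), ballast factor
# (BF), and baseline wattage. All numbers are invented placeholders.

def fixture_savings_kwh(base_w: float, base_bf: float,
                        eff_w: float, eff_bf: float, hou: float) -> float:
    """Annual kWh saved per fixture. The ballast factor scales rated
    lamp wattage to actual input wattage for ballast-driven fixtures."""
    return (base_w * base_bf - eff_w * eff_bf) * hou / 1000.0

# Example: a fluorescent fixture (ballast factor 0.88) replaced by an
# LED retrofit with no external ballast (BF = 1.0):
kwh = fixture_savings_kwh(base_w=112, base_bf=0.88,
                          eff_w=40, eff_bf=1.0, hou=3500)
print(f"{kwh:.0f} kWh/yr per fixture")
```

Omitting the baseline ballast factor, or using a nameplate HOU instead of the facility's actual schedule, shifts the result directly, which is why the Evaluator's adjustment ratios were largest for this category.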
2016-BER-R2. Implement complete measurement and verification (M&V) procedures for Mail-in projects generating large energy savings. The Evaluator has observed that large projects are increasingly part of BER's Mail-in project portfolio. For simpler and smaller projects, relying on documentation provided by the participant and performing simple validations may be adequate to establish savings with a fairly good level of accuracy and reliability. For large projects, the Evaluator recommends that a thorough and systematic M&V protocol be implemented, which should include on-site validation, verification of technical documentation and, where warranted, measurement of key parameters. ENS is currently developing such M&V procedures for Retrofit projects. The same procedures should be applied to BER projects generating 300,000 kWh or more to limit the corrections needed to the tracked energy savings.
2016-BER-R3. Conduct deeper market analysis to adapt the BER Instant Rebates offer when needed. Strong market evolution was noticed for 2016. The high number of LEDs sold during the 2016 year, their declining retail price and the increase of free-ridership levels for these products indicate that market barriers have been reduced. Moreover, LED linear fixtures are more often included in the plans and specifications of large new construction projects. The Evaluator therefore recommends conducting deeper market analysis in the next evaluation to determine if the significant changes observed in 2016 persist over time and to recognize the point at which some product categories should be removed from the Instant Rebates service due to a high level of market transformation. To obtain insights and data from the BNI market, which is a challenging task, the Evaluator recommends drawing on as many data sources as possible. For example, interviews with contractors might be performed to confirm the input obtained from interviews with distributors. Sales data for non-efficient lighting products, if available, might also allow comparison with the sales levels of products rebated through Instant Rebates.
2016-BER-R4. Update the values used in the BER Mail-in and Instant Rebates calculations to match the 2016 evaluation results. The Evaluator updated a number of values as part of the 2016 evaluation. The methodology for calculating interactive effects was revised, as were the operating hours of heat pumps (equivalent full load hours). It is recommended that ENS update these values in their tracking system for 2017.
This Savings Verification study supports each of Econoler’s recommendations (2016-
BER-R1, 2016-BER-R2, 2016-BER-R3 and 2016-BER-R4). However, for
recommendation 2016-BER-R2 we note that complete M&V procedures may
discourage some large savings projects, so this recommendation must be
implemented with caution and balance. For these large-savings projects, ENS must
participate in determining the required M&V.
In addition, we provide one transformative recommendation for future evaluations.
The evaluation of the Business Energy Rebates program is thorough, well-reasoned
and well-written. This evaluation is fully within the scope of an experienced
independent evaluator.
10) Custom Program
The Custom program for business, non-profit and institutional (BNI) customers
includes three components: (1) Retrofit, (2) New Construction and (3) Building
Optimization. Most projects for 2016 were in the Retrofit component. There were
seventy-six Retrofit projects, of which seventy reported full or partial savings for 2016.
This component provides technical and financial support for scoping and feasibility
studies, and for implementing improvements. New Construction provides incentives
to help achieve electricity savings and reduce peak load in new buildings, as well as
for additions and major renovations. The New Construction component had
unusually low participation in 2016. Building Optimization provides no-cost and low-
cost energy-efficiency measures, identified through recommissioning. This
component was withdrawn for redesign for part of the year. Overall, the Custom
program has very low free-ridership (10%) with no spillover.
• For Retrofit, there were seventy reporting projects in 2016. The evaluator
conducted desk reviews and on-site visits with simulation model review (n=25). The
on-site visits included a participant survey, which was used to establish free-ridership
and internal spillover and to develop the net-to-gross ratio (NTGR =
0.90). Interactive effects were calculated.
Recommendation 2016-Business Energy Rebates-SV-R1: ENS should consider the benefits and costs of structuring the 2017 evaluation for the LED measures for Business Energy Rebates as a market transformation evaluation rather than as a resource acquisition evaluation with a market evaluation component.
• For New Construction, there were six projects in 2016. The evaluator
conducted on-sites with simulation model review (n=3). For this component,
the baseline for calculating savings is the original design intent of participants
rather than current practice. This tends to lower gross energy savings. The
net-to-gross ratio was set to 1.00.
• For Building Optimization, there were four projects in 2016. Selected service
providers were interviewed (n=3) and files were reviewed along with on-site
visits (n=3). The net-to-gross ratio was set at 1.00.
Six recommendations were provided by Econoler:
2016-Custom-R1. Develop tools to help service providers better promote and deliver Building Optimization. Service providers interviewed noted that participants tend to misunderstand the scale and scope of the Building Optimization participation process and the outcomes to be expected. Therefore, the Evaluator recommends developing marketing materials, such as case studies illustrating the typical activities and outcomes associated with Building Optimization, to assist the service providers in recruiting potential participants. It is expected that having more organizations better informed about Building Optimization’s process and benefits will help drive participation levels higher.
2016-Custom-R2. Identify the root causes of New Construction's low participation. Although the ongoing slowdown in Nova Scotia's construction industry is beyond ENS's control, the Evaluator still considers it necessary to properly examine and analyze the causes leading to New Construction's low participation level. Since only two participants were interviewed for this evaluation, the Evaluator cannot provide insightful conclusions about the barriers encountered by participants. Interviewing a larger group of participants and, even more importantly, non-participants could provide useful information and insights on how to make New Construction more attractive to potential participants. These interviews could be conducted as part of the 2017 evaluation.
2016-Custom-R3. Start using the new method for calculating interactive effects associated with Retrofit lighting projects. In 2016, the Evaluator improved the methodology used by ENS for calculating interactive effects in Retrofit and applied it to all commercial program components. It is recommended that this new methodology be applied in 2017.
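An interactive-effects adjustment of the kind 2016-Custom-R3 refers to typically scales direct lighting savings by HVAC interaction factors: reduced lighting waste heat lowers cooling energy but increases heating energy. The factors below are invented placeholders, not values from the revised ENS methodology.

```python
# Sketch of an HVAC interactive-effects adjustment for lighting
# retrofits. The interaction factors are invented placeholders,
# not the values in the revised ENS methodology.

def net_lighting_savings(lighting_kwh: float, cooling_factor: float,
                         heating_penalty: float) -> float:
    """Direct lighting savings adjusted for HVAC interaction:
    cooling bonus added, heating penalty subtracted."""
    return lighting_kwh * (1.0 + cooling_factor - heating_penalty)

# Example: 10,000 kWh of direct lighting savings in a space with
# electric cooling and electric heating (illustrative factors):
net = net_lighting_savings(10000.0, cooling_factor=0.08,
                           heating_penalty=0.15)
print(f"net savings after interaction: {net:.0f} kWh")
```

In a heating-dominated climate such as Nova Scotia's, the heating penalty can plausibly outweigh the cooling bonus, so net electric savings fall below the direct lighting savings when heating is electric.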
The Savings Verification study supports all six Econoler recommendations (2016-
Custom-R1, 2016-Custom-R2, 2016-Custom-R3, 2016-Custom-R4, 2016-Custom-R5
and 2016-Custom-R6). However, recommendation 2016-Custom-R5 is problematic
because companies participating in custom retrofit may have their own internal
standards for assessing results and may not want to take on the overhead of
standardized protocols. There is a trade-off between participation and precision, so
we suggest care and balance in implementing this recommendation; some trade-off
may be needed to maintain good relationships and participation.
This program is different from the mail-in process in Business Energy Rebates
because the Custom team is on-site and works with owners or managers in
developing the measurement process, giving participants direct input into that
process.
2016-Custom-R4. Ensure that a systematic review of Retrofit's peak demand savings calculations is performed. This year, the adjustment ratio established for Retrofit projects' peak demand savings was high, mainly because the consultants failed to account for peak demand savings in their calculations, or because ENS was overly cautious in tracking savings for projects that were not simple load reductions, such as lighting replacements. In order to better estimate peak demand savings, the Evaluator suggests that ENS require (1) that the consultants include peak demand savings calculations as a mandatory element in their submission packages, and (2) that all projects without peak demand savings be validated through an internal review process.

2016-Custom-R5. Implement a standardized process for validating M&V results and savings for Retrofit projects. Two main types of errors were detected and corrected as part of this evaluation: (1) incorrect assumptions for the operating conditions; and (2) necessary variables omitted from the savings calculations, such as correction factors for three-phase current and interactive effects. A more significant error was found for one project where the equipment installed was not the same as that assumed in the savings calculations. To improve savings estimation accuracy, ENS should improve its M&V procedures by developing M&V summaries and standardized tools, which would help ensure that all key variables are systematically validated at the time of project close-out.

2016-Custom-R6. Include an estimate of peak demand savings in the M&V plans for Building Optimization projects. By reviewing a sample of three projects, the Evaluator found that peak demand savings should have been tracked for two of them. ENS should require that all M&V plans include either an estimate of peak demand savings or a summary of the energy conservation measures with the potential to reduce peak demand.

In addition, we include one Savings Verification consideration. If we simply look at
downtown Halifax, we see buildings that have lasted (with ongoing maintenance and
major internal renovation) for two hundred years and more. Nova Scotia leads in
climate mitigation13; here we focus on climate adaptation.14
11) Energy Management Information Systems (EMIS)
The Energy Management Information System (EMIS) provides either an incentive or
zero-percent on-bill financing to customers for the implementation of energy
monitoring systems. Support can include an EMIS audit, the development of an EMIS
plan (EMIS implementation plan), and the installation of the physical metering and
monitoring equipment (implementation of EMIS). EMIS also includes training and
organizational support from audit through achievement of actions using system data
to produce energy savings. For this program type, the Evaluator bases impact
results on actual incremental (yearly) savings.
There were seven participants in EMIS for 2016, four of which were at the stage of actively generating savings. Of these four sites, one required a corrective adjustment that eliminated some false savings. There were also some baseline problems, corrected by setting the baseline to 2015 rather than 2014 (to yield yearly incremental savings). It is expected that the method will be further refined in future years. Both energy savings analysis and peak load analysis were conducted by the Evaluator.
Interviews were conducted with the Program Manager, program delivery agents and
participants. The result of zero free-ridership for this program and the net-to-gross
ratio (NTGR) of 1.00 are based on 2015 interview results carried over into 2016.
This NTGR of 1.00 is also consistent with current industry expectation for this
program type due to the complex series of hurdles that facility management must
negotiate for the implementation of the program. According to interview results there
is an opportunity for better performance by integrating EMIS with Strategic Energy Management.15

13 See: Nova Scotia's Action on Climate Change, https://climatechange.novascotia.ca/action-on-climate-change (accessed 04/10/2017).

14 Climate change involves a problem in the perception of time and of the speed of change, both for the main temperature effect and for separate climate incidents such as hurricanes and water shortages. Most climate projections end at 2100, since uncertainties beyond that point are large. However, discussion of criteria and coordination with others working on these problems can be of benefit.

Recommendation 2016 Custom-SV-R1: ENS should consider meeting with the appropriate provincial agency and with climate adaptation advocates to explore the evolving overlap of DSM and climate adaptation and to develop a list of climate adaptation criteria for new construction and major renovation projects.
We found the EMIS evaluation to be thorough and generally well described. However, greater detail on how the baselines were established for the projects not using a regression-based baseline would be useful. We would also like to see a step-by-step table (as indicated in Recommendation 2016 EMIS-SV-3). Otherwise, the types of adjustments made to arrive at estimated savings are well reasoned and explained.

Impact evaluation of EMIS programs invokes a methodology and corresponding terminology unique to this program. Greater clarity in defining these terms and their values by project would be helpful.
The Evaluator raises the issue of persistence of savings in the "Gross Savings" section on page 46, questioning the use of a one-year measure life for EMIS savings given the belief that some savings would persist without additional ENS intervention. We agree that savings may persist beyond ENS involvement and encourage ENS and the Evaluator to consider using a longer measure life based on a current review of utility programs and evaluation studies.16
At times the EMIS Impact Evaluation discussion of cumulative and incremental savings and measure life is hard to follow and seemingly contradictory on the persistence of savings. For example, the explanation of cumulative savings in paragraph two of section 12.1 (p 46) refers to the "… savings achieved with respect to the baseline situation prior to the implementation of the EMIS," when a more accurate reference would have been to the projected baseline. Also, consider item 2 in the bullet list at the top of page 47, which says incremental savings were arrived at by comparing energy consumption in 2016 with energy consumption in 2015. The difference between these two values would not produce incremental savings. Presumably the Evaluator meant the comparison of 2016 energy usage with the 2016 adjusted baseline. This example illustrates the need for more precise terminology in the EMIS evaluation.

15 On this recommendation, see, for example, Opportunities for Action on Energy Management Information Systems for Industrial Customers: A Report for Program Administrators, NEEA Report E15-292, pp 18-20.

16 See, for example, Opportunities for Action on Energy Management Information Systems for Industrial Customers: A Report for Program Administrators, NEEA Report E15-292, 2015, p 11. The authors state "According to Energy Trust, EMIS has the potential to increase measure life for operational changes from 1 year to 3 or 5 years."

Recommendation 2016 EMIS-SV-1: We agree with the Service Provider recommendation of greater coordination between the EMIS and SEM programs. Further, we recommend that ENS consider merging the two programs into a single customer offer. While there are unique attributes of each that would need to be considered when combining SEM and EMIS, other utilities offer a single program for both.

Recommendation 2016 EMIS-SV-2: Add the following EMIS-related terms to the Definitions (p. iii): Baseline regression, Projected baseline, Adjusted baseline, Cumulative SEM/EMIS savings, Incremental SEM/EMIS savings. The SEM/EMIS definitions for cumulative and incremental savings should reference the relationship between reported energy usage and the projected baseline.
Recommendation 2016 EMIS-SV-3: One way to more clearly communicate the approach is to show the results for each step and each project. A table with columns for 2016 Projected baseline, Adjustments, Adjusted baseline, Actual energy usage and Incremental savings would more clearly show the methodology and results. More columns may be required to show certain details, such as the adjustment for savings from other ECM programs. Presumably such a table would also make clear how the value of 0.431 "Adjustment Ratio for Energy Savings" was calculated.

The Evaluator provides five recommendations for EMIS. The Savings Verification study supports each of these recommendations.
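To illustrate the kind of step-by-step table proposed in Recommendation 2016 EMIS-SV-3, a minimal sketch of the calculation follows. All project figures are invented for illustration; the report's actual ratio of 0.431 would emerge only from the real project data.

```python
# Sketch of the per-project table proposed in Recommendation 2016 EMIS-SV-3.
# All figures are invented for illustration only.
projects = [
    # (project, projected_baseline_kwh, adjustments_kwh, actual_usage_kwh)
    ("Site A", 1_200_000, -50_000, 1_000_000),
    ("Site B", 800_000, 0, 760_000),
]
claimed_savings_kwh = 500_000  # hypothetical savings claimed in program tracking

evaluated_savings_kwh = 0.0
for name, projected, adjustments, actual in projects:
    adjusted_baseline = projected + adjustments  # e.g., remove savings from other ECM programs
    incremental = adjusted_baseline - actual     # savings vs. the adjusted projected baseline
    evaluated_savings_kwh += incremental
    print(f"{name}: adjusted baseline {adjusted_baseline:,.0f} kWh, "
          f"incremental {incremental:,.0f} kWh")

# Analogous to the report's "Adjustment Ratio for Energy Savings"
adjustment_ratio = evaluated_savings_kwh / claimed_savings_kwh
print(f"Adjustment ratio: {adjustment_ratio:.3f}")
```

Laying the figures out this way makes each adjustment auditable and shows exactly how the portfolio-level ratio is derived from project-level results.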
2016 EMIS-R1. Continue to support EMIS and identify potential new participants. Participants value the support they receive from their service provider and ENS. Overall, they are highly satisfied with EMIS although for different reasons. Some appreciated the energy savings, while others appreciated the additional information the system provided. Additionally, EMIS 2016 demonstrated potential for additional savings beyond the first year of participation. For these reasons, the Evaluator concludes that EMIS is successful and should be continued and expanded.
2016 EMIS-R2. Continue encouraging EMIS participants to enroll in SEM and strengthen integration between the two program components. There is a natural connection between EMIS and SEM. SEM aims to change the culture of an organization in terms of how the organization views and manages energy. EMIS provides infrastructure and technical support to identify operational and process savings. Many organizations participating in one program component would benefit from the services offered by the other. It is also believed that participation in both EMIS and SEM would increase participant internal energy management capabilities and commitment, thereby helping achieve deeper and more sustainable savings.
2016 EMIS-R3. Track incremental savings rather than cumulative savings. For continuing participants in 2016, the Evaluator modified the methodology used to establish savings so that they correspond to incremental first-year savings. Indeed, savings were tracked with respect to a baseline period that spanned approximately one year prior to the beginning of EMIS activities. When participants continued taking part in EMIS for a second year, their tracked savings were the result of the sum of actions taken since the beginning of their participation and not only the additional actions taken since 2015. This was considered standard industry practice. Since ENS established all of its program component savings targets as incremental savings, the evaluated savings were modified to correspond to that definition. Energy savings that persist year over year only as a result of on-going support or intervention from ENS could also be classified as incremental savings and are not captured under the methodology used in this evaluation because no data on the rate at which savings would decrease in the absence of on-going support is available in the technical literature. As new research is conducted on the impact evaluation of programs similar to EMIS, this methodology might be refined to capture all incremental savings.
2016 EMIS-R4. Investigate ways of using energy management information systems to their full potential. The Evaluator noted that some participants who had been part of the 2015 impact evaluation had not implemented new energy conservation measures in 2016. While sustaining the savings achieved in previous years is a success in itself, the Evaluator believes that this indicates a potential for even greater savings with minimal additional investment by ENS. Since energy management information systems offer such a wealth of information to participants, the program component should ensure that they are used for continuous improvement and generate as much savings as possible. The program component could include additional support to participants or some form of performance incentive for service providers to identify potential new measures after easy, low-cost measures have been implemented.
2016 EMIS-R5. Track major changes to facility operations and equipment and adjust energy baseline consumption as required. The Evaluator identified one project for which the compressed air system had been replaced as part of a Custom Retrofit project in 2016. This modified the baseline energy consumption of the plant and resulted in double-counting savings. The Evaluator made this adjustment, but did not have sufficient data to update the baseline regression; new data should be gathered to update this baseline energy regression in 2017. To avoid this in the future, systematic reviews of facility operational and equipment changes should be implemented by ENS to ensure the accuracy of reported savings.
12) Strategic Energy Management (SEM)
Strategic Energy Management (SEM) is a program that promotes and provides training and tools for continuous improvement in energy savings through improvements to energy policies and processes. Like EMIS, SEM provides support for developing an energy plan and engaging in energy management. SEM participants can be transferred to EMIS.
There were twenty-one new SEM projects in 2016 and twenty-four projects in all. The 2016 evaluation is based on interviews with the program manager, one service provider and six participants. There were also desk reviews of eleven projects and five on-site visits. Free-ridership and spillover were assumed to be zero. We find this assumption reasonable given the near-zero probability that an organization would attempt a program of this type without ENS encouragement and active, ongoing assistance.
We found the SEM evaluation to be thorough and generally well described. The types of adjustments made to arrive at estimated savings are well reasoned and explained. Impact evaluation of SEM programs invokes a methodology and corresponding terminology unique to SEM and EMIS. Greater clarity in defining these terms and their values by project would be helpful.
At times the SEM Impact Evaluation discussion of cumulative and incremental savings, measure life and persistence of savings is hard to follow and seemingly contradictory. For example, the explanation of cumulative savings in paragraph three of section 16.1 (p 67) refers to the "… savings achieved with respect to the baseline situation prior to the implementation of the first SEM activities…". A more accurate reference would have been to the projected baseline rather than the pre-SEM baseline usage. Comparing current usage to pre-SEM usage would not yield savings unless the baseline is projected to reflect current-period operating levels. Also, consider the last bullet under Desk Reviews (p 68), which says incremental savings were arrived at by comparing energy consumption in 2016 with energy consumption in 2015. The difference between these two values would not produce incremental savings unless 2015 is adjusted to reflect non-program changes since 2015, such as operating levels.

Recommendation 2016 SEM-SV-1: Add the following SEM-related terms to the Definitions (p. iii): Baseline regression, Projected baseline, Adjusted baseline, Cumulative SEM/EMIS savings, Incremental SEM/EMIS savings. The SEM/EMIS definitions for cumulative and incremental savings should reference the relationship between reported energy usage and the projected baseline.

Recommendation 2016 SEM-SV-2: We support the Service Provider recommendation of a longer engagement with SEM participants.
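The projected-baseline comparison discussed above can be sketched with a simple regression baseline. The production and consumption data below are invented, and ordinary least squares on a single driver is only one of several ways such a baseline might be fit.

```python
import numpy as np

# Invented monthly data: baseline-year energy use driven by production volume.
baseline_production = np.array([100, 120, 90, 110, 130, 105], dtype=float)
baseline_kwh = np.array([520, 600, 470, 555, 640, 530], dtype=float)

# Fit kWh = slope * production + intercept on the baseline period.
design = np.vstack([baseline_production, np.ones_like(baseline_production)]).T
slope, intercept = np.linalg.lstsq(design, baseline_kwh, rcond=None)[0]

# Project the baseline to the reporting period's ACTUAL operating levels,
# then compare with actual usage. This is the projected-baseline comparison,
# not a raw 2016-vs-2015 consumption difference.
reporting_production = np.array([115, 125, 95], dtype=float)
reporting_kwh = np.array([540, 580, 450], dtype=float)

projected_kwh = slope * reporting_production + intercept
incremental_savings = float((projected_kwh - reporting_kwh).sum())
print(f"Incremental savings vs projected baseline: {incremental_savings:,.1f} kWh")
```

Because the baseline is re-projected at current operating levels, a plant that produced more in the reporting year is not penalized, and one that produced less is not credited with phantom savings.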
Recommendation 2016 SEM-SV-3: One way to more clearly communicate the approach is to show the results for each step and each project. A table with columns for 2016 Projected baseline, Adjustments, Adjusted baseline, Actual energy usage and Incremental savings would more clearly show the methodology and results. More columns may be required to show certain details, such as the adjustment for savings from other ECM programs. Presumably such a table would also make clear how the value of 0.463 "Adjustment Ratio for Energy Savings" was calculated.

Recommendation 2016 SEM-SV-4 (Edits/Corrections): On P. 55, Figure 11, should the 2015 participant bar be 6 and not 9? On P. 64, third bullet, should the bullet read "Three respondents…" instead of "Two respondents…"? See Table 24, p 62.

13) Small Business Energy Solutions

The Direct Installation program has only one component, Small Business Energy Solutions (SBES). SBES provides two paths: a do-it-yourself (DIY) path and an audit path. The audit path provides turnkey implementation, beginning with a free on-site energy assessment for small businesses. There is a substantial "buy-down" of the full installation cost, and many kinds of energy-efficient measures are included in the program. Service for common areas of the Rental Properties and Condos (RP&C) program is included in SBES. For 2016, there were twenty-nine participants on the audit path and 230 projects on the DIY path.

The range of measures offered through SBES is extensive and includes HVAC measures, commercial lighting measures, commercial kitchen measures, water-heating measures, and commercial laundry measures.

The Evaluator conducted a program documentation review, an interview with the program manager, five interviews with auditor agencies, six interviews with program contractors, and twelve interviews with participants who started the program but left before completing it. There was also a survey of thirty-seven participants. There were five on-site visits to observe audits, and fifty-one projects received desk reviews plus on-site visits (ten on the audit path and forty-one on the DIY path). Both free-ridership and internal spillover were assessed. The interviews with participants who left the program enabled capture of savings implemented but not reported to the program. Free-ridership was calculated separately for the audit path and for the DIY path. Interactive effects were considered.
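As a sketch of how path-level free-ridership and spillover feed net savings, one common net-to-gross formulation (NTG = 1 − FR + SO) can be illustrated as follows. All rates and gross savings figures are invented; the report does not state the SBES values here.

```python
def net_savings(gross_kwh: float, free_ridership: float, spillover: float) -> float:
    """Net savings under a common net-to-gross formulation: NTG = 1 - FR + SO.

    Free-ridership removes savings that would have occurred anyway;
    spillover adds program-influenced savings not claimed in tracking.
    """
    ntg = 1.0 - free_ridership + spillover
    return gross_kwh * ntg

# Illustrative path-level results (all figures are invented):
audit_net = net_savings(gross_kwh=400_000, free_ridership=0.08, spillover=0.02)
diy_net = net_savings(gross_kwh=250_000, free_ridership=0.15, spillover=0.05)
print(f"Audit path net: {audit_net:,.0f} kWh; DIY path net: {diy_net:,.0f} kWh")
```

Calculating the two paths separately, as the evaluation did, matters because DIY participants typically show different free-ridership than participants receiving a full audit.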
The Evaluator performed a thorough documentation review, and the measure-level savings calculations are consistent with better industry practices.

The Evaluator provides five recommendations for SBES. We support all five.
2016 SBES-R1. Assist participants throughout the participation process, from enrollment to completion, to minimize the number of participants who do not complete projects and encourage equipment installation. Participants receiving no assistance from contractors throughout the participation process face considerable barriers to implementing projects, including making decisions without technical assistance and spending time seeking bids. Without the assistance of contractors, participants do not make their project a priority, thus leading to potential delays in or failure to complete projects. Any actions ENS can take to connect contractors with participants and remind them that the program component offer is still available should yield more implemented projects. Some examples illustrating how other program administrators have taken meaningful actions to support participants include: randomly assigning contractors to participants; adding general contractors to their trade networks to ensure participants have access to contractors that can manage entire projects; and having the audit report specify a list of contractors who can address each identified measure.
2016 SBES-R2. Engage, support, and provide resources to help contractors promote SBES among potential participants. The highest implementation rates have been so far achieved by contractor-assisted DIY projects. All efforts made by ENS to develop and strengthen its relationships with contractors can potentially stimulate more occurrences of contractor-assisted DIY projects and enable completing more projects. Some examples illustrating how other program administrators have taken meaningful actions to support contractors include: providing preprinted marketing materials; offering cooperative marketing support; offering sales training to contractors; offering a program-specific training session once a year; and conducting outreach to contractors about the program using mailings, phone calls, or by hosting contractor breakfasts throughout the province.
2016 SBES-R3. Consider working with auditors to make the process of developing cost estimates less burdensome for auditors, contractors, and distributors. Currently, the audit path requires auditors to obtain new cost estimates for each project. Auditors call distributors and contractors who then voluntarily provide bids for the identified measures of a project. These contractors and distributors are unlikely to obtain the project simply because the auditor contacted them, partly because the auditor cannot recommend these contractors to participants. To address this issue, ENS could meet with auditors to discuss alternative methods for estimating costs.
2016 SBES-R4. Improve the application forms and the internal review procedure to ensure the accuracy of savings calculations for DIY projects. The Evaluator found that the variables used in the savings calculation algorithm were often not representative of the actual equipment and operating conditions observed on site. The hours of use for lighting measures were the variables that most often required an adjustment, and those adjustments were generally significant. The Evaluator believes that requesting more information in the application forms would at least partially address this issue; operation schedules should be provided for each measure in each area of the facility. The Evaluator found multiple cases where the hours of use entered in the tracking sheet were the same for all the measures, while the real hours of use varied throughout the facility. Special attention should also be paid to validating the quantities of fixtures and lamps installed; currently, most invoices do not include the information needed to validate the quantities claimed for each project.
2016 SBES-R5. Identify those SBEAs who are qualified to conduct audits at industrial facilities. The on-site visits conducted during energy audits showed that SBEAs possess varying skill levels. While all SBEAs conducted their audits in a professional manner, one SBEA mentioned that he did not have the technical background needed to evaluate energy-saving opportunities at an industrial facility. Industrial facilities require specific technical knowledge. Therefore, ENS may want to identify those SBEAs who are qualified to make industrial facility assessments and assign them to industrial-sector participants.
14) Regulation: Codes and Standards
In Nova Scotia, impact evaluation results for energy savings and demand reduction from codes and standards17 are not attributed (in whole or in part) to the DSM Program Administrator. However, each year the Evaluator is required to estimate energy savings and demand reduction impacts from codes and standards, separately from the DSM Administrator's program impacts.
1. Minimum Energy Performance Standards (MEPs)
Codes and standards are developed through an ongoing technical, deliberative and consultative effort led by the federal and provincial governments, typically with the extensive participation of interested parties. This effort is supported by quantitative analysis, technical reviews and discussions. The results of this process are improvements to current minimum energy performance standards (MEPS) and the development and enactment of new MEPS.
MEPs have at least three interesting characteristics:
• Lags: An aspect of MEPs is that although a new or improved MEP may go
into legal effect on a certain date, when changes in codes and standards are
introduced there is typically a time lag for the change to become fully
embodied in actual practice.18 Sometimes this is treated as an expected
feature of a MEP, and planned for. Other times this is viewed as a kind of
non-compliance with a MEP.
• Market shaping: MEPs are powerful because they modify whole markets.
Each advance in regulation improves the state of technology-in-use and
corrects market failures (to improve energy efficiency) so that the shape of a
whole market is changed. Traditional market analysis and DSM programs
then take place within the shape of the market as established by MEPs.
17 A code is a specification of policy; a standard is a technical standard (the “how to” for implementing
one or more elements of a code).
18 For buildings, it takes time for builders and workers to become comfortable and skilled with new
approaches. For appliances or lighting or other unitary measures, it takes time to clear stock from
supply channels and to make changes in supply chains. Typical lags can often be on the order of a
year or more.
• Reliability: MEPs are also powerful because they have extensive long-term
reliability. Typically, by means of MEPs, the energy efficiency aspect of a
product is improved going forward. This improvement at the bottom level of
technology remains until the technology is replaced by something even more
energy efficient.
2. MEPs and DSM Programs
The energy savings and demand reduction impacts of codes and standards and the
energy savings and demand reduction effects of Efficiency Nova Scotia’s Demand-
Side Management (DSM) programs are currently treated as mutually exclusive:
• Efficiency Nova Scotia’s DSM programs improve energy savings through
program implementation to raise performance for program participants beyond
the currently operative baseline level. This is energy savings/demand
reduction achieved beyond energy savings/demand reduction caused by
MEPs.
• In contrast, MEPs work to improve energy savings/demand reduction for
everyone by requiring better baseline performance levels.
Though the impacts of MEPs are outside the DSM Administrator’s portfolio, MEPs
and DSM programs are both competitive and complementary:
• DSM programs and MEPS are competitive in the sense that each incremental
improvement in baseline performance set by codes and standards raises the
bar for program work. The gradual improvement of existing MEPs and the
introduction of new MEPs slowly reduces the amount of savings available from
DSM programs and reduces their cost-effectiveness. To maintain cost-
effectiveness, energy efficiency programs and measures must be continually
enhanced.
• DSM programs and MEPs are complementary when implemented together,
either directly (as pioneered by BC Hydro in the 1980s) or indirectly. As DSM
programs improve practice, parties become increasingly familiar, comfortable
and confident with higher practice standards and it becomes increasingly likely
that parties will reach agreement on improved MEPs. In some jurisdictions,
this is accomplished through programs conducted by the DSM Administrator;
in others, it simply happens without much coordination or direction. These
steps are then iterated into the future. In this complementary perspective, the
high-level strategy is to use programs to improve practices and performance;
then to incrementally lock-in higher performance levels with MEPs.19
3. Five MEPs Areas
As part of Canada's Clean Air Regulatory Agenda established in 2006, amendments to the Energy Efficiency Regulations have been regularly planned to deliver energy, greenhouse gas and air pollutant reductions.20 In addition, Nova Scotia has decided to apply MEPS to street lights and to enforce an Energy Code applicable to houses, small buildings, large buildings, condos and apartments.21 The calendar year 2016 evaluation focuses on five areas:
• Incandescent Lamps (75 W and 100 W)
• Incandescent Lamps (40 W and 60 W)
• LED street lights
• New building energy code for houses and small buildings, and
• New building energy code for large buildings, condos and apartments.
4. Evaluation Approach & Method
The method used for evaluation of codes and standards in the Econoler evaluation is analysis of secondary data (primarily official and/or business statistical series). This focused quantitative work is interpreted using additional qualitative information gathered by conducting interviews and participating in discussions with members of the relevant industry associations, government and municipality representatives, and selected retailers. Econoler's methodology for MEPs is shown in Figure 14.22

19 Following Campbell, this mechanism of regulation through the "administrative state" is how we socially and incrementally build intelligence into coordinated human activity. If we build intelligence increasingly into our common practice, we may survive and do well. If we fail to build intelligence into common work, then we fail to thrive and become diminished.

20 MEPs tie energy efficiency advances to Canada's Clean Air Regulatory Agenda. See P. vii & P. 1 of Econoler, Introduction of New Codes and Standards in Nova Scotia, 2016 DSM Evaluation, Final Report, March 8, 2017.

21 Nova Scotia is also the first province in Canada to make light-emitting diode (LED) roadway lights a minimum requirement. Econoler, Introduction of New Codes and Standards in Nova Scotia, 2016 DSM Evaluation, P. viii & P. 2.

Figure 14: Methodology Employed by Econoler for MEPs.
For each of the five regulations, Econoler develops three major parameters:
• Market size: The market size for each product category;
• Portion of market impacted by the regulation: The proportion of the
market on which the regulations have had an impact;
• Unitary Savings: Unitary savings values for each regulation, i.e., the
difference between the baseline consumption (absent the regulation) and the
consumption of products that meet the regulation.
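These three parameters combine multiplicatively, with shipment data scaled to Nova Scotia by a population ratio, as the evaluation does with Atlantic Canada shipment data. A minimal sketch follows; all figures are invented for illustration.

```python
# Sketch of the three-parameter MEP savings calculation (all numbers invented).
atlantic_shipments = 1_000_000   # units shipped, Atlantic Canada basis
ns_population_share = 0.40       # assumed NS population / Atlantic population ratio
market_size = atlantic_shipments * ns_population_share  # units attributable to NS

portion_impacted = 0.85          # assumed share of the market affected by the regulation
unitary_savings_kwh = 25.0       # baseline kWh minus compliant-product kWh, per unit

gross_mep_savings = market_size * portion_impacted * unitary_savings_kwh
print(f"Gross MEP savings: {gross_mep_savings:,.0f} kWh")
```

Each factor maps directly to one of the three parameters above, so errors in any one of them scale the final estimate proportionally.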
The primary sources of information used were Statistics Canada (building permit
data), Canada Mortgage and Housing Corporation (building statistics), Nova Scotia
Municipalities (street lighting stock data) and industry trade associations. In addition to relevant data series, MEP analysis requires wide and deep knowledge of the workings and context of each specific technology market and of the specific technologies. The purpose of the evaluation is to develop the savings and demand reduction (gross and net) due to the MEPs. Since shipment data is available on an Atlantic Canada basis, ratios were developed using provincial population data from Statistics Canada. Street lighting data was available within the province. The Evaluator endeavored to ensure there was no overlap or double counting in estimating impacts. Interactive effects were accounted for in the evaluation method.

22 Figure 14 reproduces Figure 1, P. 3 in Econoler, Introduction of New Codes and Standards in Nova Scotia, 2016 DSM Evaluation.
A non-compliance ratio was developed for houses. However, Naturally Occurring Market Adoption factors (NOMADs) do not seem to be fully developed for this evaluation. The performance of a new MEP should, at least theoretically, be systematically discounted for NOMADs. The NOMAD calculation reflects the fact that some installations would have occurred in the absence of the MEP (though this factor may be negligible for some MEPs, in other cases it may be substantial).
Other NOMAD factors include the learning curve for personnel and the evolving pattern of engagement and disengagement of agencies and trade allies. This includes whether required staffing levels for compliance data gathering are funded, inspection priorities (inspection agencies are typically understaffed, so it is not possible to meet all inspection requirements), and direct education efforts. The NOMAD calculation for MEPs is analogous to the free-ridership calculation for DSM programs.23
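A NOMAD discount applied to gross MEP savings can be sketched in one step; the rate below is an assumed illustration, not an evaluated value.

```python
def nomad_adjusted_savings(gross_kwh: float, nomad_rate: float) -> float:
    """Discount gross MEP savings by the share of installations that would
    have occurred anyway (the NOMAD factor, analogous to free-ridership)."""
    if not 0.0 <= nomad_rate <= 1.0:
        raise ValueError("nomad_rate must be between 0 and 1")
    return gross_kwh * (1.0 - nomad_rate)

# Example with invented figures: 8.5 GWh gross, assumed 20% naturally occurring adoption.
net_kwh = nomad_adjusted_savings(8_500_000, nomad_rate=0.20)
print(f"NOMAD-adjusted savings: {net_kwh:,.0f} kWh")
```

As the text notes, the appropriate rate varies by MEP: near zero for some technologies, substantial for others already gaining market share on their own.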
The evaluation, by addressing the portion of the market impacted by each MEP, develops both gross and net impacts.
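The gross-to-net adjustment described above can be illustrated with a minimal sketch; the factor values are hypothetical, not taken from the evaluation.

```python
# Illustrative sketch: discounting gross MEP savings for naturally occurring
# market adoption (NOMAD) and, optionally, for non-compliance. The factor
# values used below are hypothetical.

def net_mep_savings(gross_kwh, nomad_factor, non_compliance=0.0):
    """Net savings after removing adoptions that would have occurred anyway
    and units that do not comply with the standard."""
    return gross_kwh * (1.0 - nomad_factor) * (1.0 - non_compliance)

# E.g., 15% naturally occurring adoption and 10% non-compliance (hypothetical).
print(round(net_mep_savings(1_000_000, nomad_factor=0.15, non_compliance=0.10)))
# 765000
```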
5. Attribution
Though not required for Nova Scotia, after the determination of gross and net energy
savings and demand reduction from MEPs, in some other jurisdictions24 there is a
23 Chappell, Codes and Standards Program Attribution – Potential Options. PowerPoint for the
Midwest Energy Efficiency Alliance’s Midwest Building Energy Codes Conference, October 3, 2010, P.
4. (http://mwalliance.org/sites/default/files/uploads/meeaconference/MES-
2013_presentations_Chappell.pdf).
24 For example, British Columbia, California, Massachusetts, Arizona, New York, Vermont and Maine
as well as in the US Pacific Northwest where the Northwest Energy Efficiency Alliance (NEEA) carries
out this function for the DSM Administrators. See: Lee, Allen, Energy-efficiency Building Codes and
Appliance Standards (C&S), http://www.cadmusgroup.com/wp-content/uploads/2012/11/CS-Article-
for-Strategies-Allen-Lee.pdf.
Recommendation 2016 MEPS-SV-1: For future evaluations, please systematically address what savings would have resulted naturally via Naturally Occurring Market Adoption Factors (NOMADs) for each MEP, and decrease impacts by those amounts for each MEP.
standard next evaluation step: the attribution of some portion of the impact results to
the DSM Administrator. This is analogous to the process of analyzing net-to-gross for
DSM programs.25 The specific evaluation question for this step is: “What percentage
and amount of the net energy savings brought about by the MEP is due to the work
of the DSM Administrator?”
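As a sketch of this step (the attribution share below is purely illustrative; as noted, Nova Scotia does not currently perform this calculation):

```python
# Illustrative sketch of the attribution step performed in some other
# jurisdictions: credit a percentage of net MEP savings to the DSM
# Administrator. The 30% share below is hypothetical.

def attributed_savings(net_kwh, attribution_share):
    """Portion of net MEP savings credited to the DSM Administrator."""
    if not 0.0 <= attribution_share <= 1.0:
        raise ValueError("attribution_share must be a fraction in [0, 1]")
    return net_kwh * attribution_share

print(round(attributed_savings(800_000, 0.30)))  # 240000
```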
Nova Scotia does not attribute a portion of savings from MEPs to the DSM
Administrator, in part because MEPs come from the federal and/or provincial level.
However, the Nova Scotia DSM Administrator is actively supporting the MEPs,
through codes and standards efforts. Also, the resource acquisition programs help
move markets and build an environment for moving MEPs forward.
MEPs can be supported by DSM programs. There are two types of MEP-supportive DSM program.26 Efficiency Nova Scotia is involved in the first type.
• Codes and standards programs develop MEPs. Efficiency Nova Scotia is
involved in this effort indirectly by running its programs and marketing and
education efforts. This work advances the level of technology in practice; by working with trade allies to improve field performance in meeting and going beyond code, higher minimum performance is achieved.27 Viewed broadly,
these are parts of the ongoing MEPs process. The DSM Administrator is also
involved directly. Efficiency Nova Scotia has an active seat on the Canadian
Standards Association’s (CSA) Steering Committee on Performance Energy
Efficiency and Renewables (SCOPEER) which sets strategic direction for EE
standard development in Canada. In addition, ENS has a seat on the Strategic
25 Chappell, Codes and Standards Program Attribution – Potential Options. PowerPoint for the
Midwest Energy Efficiency Alliance’s Midwest Building Energy Codes Conference, October 3, 2010, P.
3. (http://mwalliance.org/sites/default/files/uploads/meeaconference/MES-
2013_presentations_Chappell.pdf).
26 Protocol on Codes and Standards and Compliance Enhancement Evaluation, Pp. 81-104 in
TecMarket Works, California Energy Efficiency Evaluation Protocols: Technical, Methodological, and
Reporting Requirements for Evaluation Professionals, April 2006. See P. 81.
http://www.calmac.org/events/EvaluatorsProtocols_Final_AdoptedviaRuling_06-19-2006.pdf.
27 This ongoing effort also involves the Efficiency Nova Scotia marketing functions, considered as a
totality. DSM program marketing also promotes general awareness, understanding and effort for
energy efficiency. The annual Bright Business conference and the program of annual efficiency
awards promotes positive thinking regarding energy efficiency in the business community. All of this
effort, as a totality, also has an impact for the MEPs process.
Resource Task Force (SRTF) at CSA. This is the group of members for
SCOPEER. Some technical staff also serve on more specific technical
committees to support specific standard development. In addition, ENS works
with both the Provincial and Federal governments to assist in adopting CSA
standards as MEPS. At the Provincial level, ENS has a moderate degree of
influence, while at the national level ENS is consulted from time to time as a
stakeholder. Efficiency Nova Scotia sends staff to MEPs meetings on a
frequent basis at the Provincial level – and, sporadic opportunities exist at the
Federal level. ENS attends CSA meetings on a regular basis. Efficiency Nova
Scotia estimates that about 0.05 FTEs are dedicated to Codes and Standards
work across the organization.28
• Compliance programs are programs directed towards enhancing
compliance with MEPs. The initial impact of a new MEP can be quite low due
to an expected non-compliance lag until practice is brought up to the new
performance level. To address this problem, some DSM Administrators run
code enhancement programs which may involve training for trade allies,
collecting information on compliance, placing DSM Administrator personnel in
county-level code enforcement departments and providing supplementary
funding for county code enforcement. Efficiency Nova Scotia is not currently
conducting this type of program.
6. Reliability and Duration of MEPs Impacts
Physical deterioration of measures occurs in the same manner as with specific DSM
program measures; the rate of attrition of savings may be similar or different
depending on the specific technologies.
Measure life may be affected by changes in materials required to meet MEPs. For
example, substituting plastic parts for metal parts in appliances tends to shorten their
operating life. This may or may not be correlated with increased levels of energy
efficiency. If it is associated with attaining energy efficiency, the evaluator should
take decreased service life into account.
28 This activity specification was provided by Lauralee Sims, Acting Efficiency Nova Scotia Evaluation
Manager to Gil Peach, via e-mail on March 29, 2017.
Recommendation 2016 MEPS-SV-2: In the next evaluation, the evaluator should carry out the attribution step for each MEP.
In contrast to a DSM program, a MEP (usually with a lag) eventually creates a new
baseline for the covered technology. This means that new units are constantly being
installed into the future at no additional cost to the DSM Administrator. When a MEP
is successful, any replacement of a unit at any time incorporates the MEP level of
efficiency or better.
The Savings Verification study supports the following Econoler recommendation for
the program year 2017 evaluation, since it is reasonable to expect a lag.
The methods adopted and analysis developed in this evaluation are within the scope
of an experienced and professional independent evaluator. They generally follow the
relevant parts of the Codes and Standards and Compliance Enhancement Protocol
of the California Public Utilities Commission29 and the Savings & Evaluation
Methodology for Codes and Standards Initiative submitted to the Massachusetts
Department of Energy Resources on behalf of the Massachusetts Program
Administrators.30 These documents should be consulted in developing the
evaluation for program year 2017.
29 The California protocol was the first evaluation protocol for codes and standards work. Much of the
protocol is written for the context of the California institutional framework, but the core technical
elements remain relevant for evaluation. TecMarket Works, California Energy Efficiency Evaluation
Protocols: Technical, Methodological, and Reporting Requirements for Evaluation Professionals, April
2006, http://www.calmac.org/events/EvaluatorsProtocols_Final_AdoptedviaRuling_06-19-2006.pdf.
For the Codes and Standards and Compliance Enhancement Evaluation Protocol, see Pp. 81-104.
30 Mass Save, Savings & Evaluation Methodology for Codes and Standards Initiative, Submitted to
Massachusetts Department of Energy Resources on behalf of Massachusetts Program Administrators,
October 20, 2015. (http://ma-eeac.org/wordpress/wp-content/uploads/Savings-Evaluation-
Methodology-for-Codes-and-Standards-Initiative.pdf)
Recommendation 2016 MEPS-SV-3: Focus the 2017 evaluation on the same standards as evaluated in 2016. While the new standards set forth in Amendment 13 will come into force at the end of June 2017, their impact may be limited until January 2018, since retailers need a few months to sell their existing stocks.
XI. Looking toward the Future of Evaluation
This section of the report provides brief suggestions for possible consideration in future evaluations. These proposals would likely involve consideration by ENS, the Evaluator, and the Advisory Group.
We could incorporate sensitivity to the climate adaptation dimension in the
intersection between DSM and climate change to better understand and improve the
long-term value of programs.
• Presentation of the Time Dimension of Energy Savings. It would be good
to treat the time dimension more deeply.31 This would improve ability to
compare measures and programs. Evaluations currently work with one year
of actual data plus a baseline year.32 Comparisons using first year savings
include only a small part of program impact and do not account for the
duration of energy savings which varies considerably among measures and
among programs. It would be useful to see average savings-weighted
measure life and the same comparison for program groups, the BNI and
Residential portfolios and at the overall portfolio level. A technical brief from
Lawrence Berkeley Laboratory (LBL)33 supports this perspective:
31 Going deep is a reference to Daniel Kahneman, Thinking, Fast and Slow. New York: Farrar, Straus and Giroux, 2011. Fast thinking is easy and immediate, often reflexive. It is typical of what we do in
everyday life. Slow thinking is hard thinking – it goes deeper. Here we suggest a deeper
incorporation of time.
32 Also, it is important to remember that the first-year impacts are usually projections, since
installations take place over the course of each year while results are reported as first-year results.
33 Hoffman, Ian M., Steven R. Schiller, Annika Todd, Megan A. Billingsley, Charles A. Goldman, Lisa
C. Schwartz, "Energy Savings Lifetimes and Persistence: Practices, Issues and Data." Berkeley,
California: Lawrence Berkeley National Laboratory, Electricity Markets & Policy Group Technical Brief (hereinafter “LBL Technical Brief”), P. 1.
Understanding how long measures, programs and portfolios last
(lifetimes) – and the degree to which savings change over time
(persistence) – is critical to estimating the benefits of efficiency,
calculating cost effectiveness and prioritizing long-term versus short-
term efficiency actions.
When savings life is not included, a reader who wants to make comparisons is comparing entities that are not fully characterized.34 The cost of fuller treatment of the time dimension would likely be
quite small.
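The savings-weighted measure life suggested above could be computed along these lines; the measures, savings, and lifetimes shown are hypothetical, not taken from the portfolio.

```python
# Illustrative sketch: average measure life weighted by first-year savings.
# All measure values below are hypothetical.

def savings_weighted_life(measures):
    """Savings-weighted average life: sum(kWh * years) / sum(kWh)."""
    total_kwh = sum(kwh for kwh, _years in measures)
    return sum(kwh * years for kwh, years in measures) / total_kwh

portfolio = [
    (400_000, 15),  # hypothetical lighting measure: first-year kWh, life in years
    (100_000, 20),  # hypothetical envelope measure
]
print(savings_weighted_life(portfolio))  # 16.0
```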
• Persistence Evaluations. Periodic persistence evaluations for selected
measures and/or programs could be introduced to “true-up” planning
assumptions and savings estimation using empirical measurement.35 The
cost would need to be discussed with the Evaluator.
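A persistence true-up of this kind reduces, at its core, to comparing empirically measured savings in a follow-up year against the planning assumption. The values below are hypothetical.

```python
# Illustrative sketch: a persistence "true-up" compares measured savings in a
# later year against deemed first-year savings. Hypothetical values only.

def persistence_ratio(measured_kwh, deemed_kwh):
    """Fraction of deemed savings still observed at follow-up."""
    return measured_kwh / deemed_kwh

# E.g., 720,000 kWh measured at year five against 800,000 kWh deemed.
print(persistence_ratio(720_000, 800_000))  # 0.9
```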
• Market Transformation Evaluation. Portions of selected evaluations could
be approached from a market transformation evaluation perspective to specify
adoption curves (S-curves) and for use to indicate when to withdraw or revise
activity in markets. Market transformation is a unique program type36 that
requires a special evaluation approach. Primary features are strategic intent to transform a specific market; working with the complexity and dynamics of markets; and the fact that market transformation can take some years to reach payback, showing a different pattern of payback than a resource acquisition program.
34 This is sometimes called the error of misplaced comparison. See table, Threats to Usable
Knowledge, Figure 3, P. 318 in Dunn, William N., “Reforms as Arguments”, Pp. 293-326 in
Knowledge: Creation/Diffusion/Utilization, Vol. 3, No. 3, March 1982.
35 There are few studies that go back years after a program installation year to assess persistence of
measures and energy savings. One such study is: Peach, H. Gil, Anne Minor West, Howard S.
Reichmuth & Pamela Brandis, “Seven Years After: Impact Evaluation Results Employing
Extensive Site Inspection Data and Associated Pre/Post Billing Analysis,” Pp. 3.97-3.104 in the
Proceedings of the 1996 ACEEE Summer Study on Energy Efficiency in Buildings, Panel 3,
Residential Programs: Program Evaluation. The general finding in this study was that savings from
residential weatherization continued unless there was fuel switching in the home or a major change in
the use of space in the home (for example, addition of a room or remodeling).
36 For an overview of the market transformation approach, see Chapter 5: A Market Transformation
Perspective, in International Energy Agency, Creating Markets for Energy Technologies. Paris:
Organization for Economic Cooperation and Development/International Energy Agency, 2003. Also,
see: Distinction between Resource Acquisition Programs and Market Transformation Initiatives in
Prahl, Ralph and Ken Keating, Building a Policy Framework to Support Energy Efficiency Market
Transformation in California. California Public Utility Commission, December 9, 2014, Table 1, P. 12.
A workable definition of market transformation is provided by the Northwest
Energy Efficiency Alliance, NEEA (text box):37
In a few areas, the 2016 evaluations are moving towards market
transformation and push the structural limitations of market evaluation.38 In
these areas, the market evaluations are struggling to articulate market
transformation within an evaluation format that is not complex enough to
accommodate that expression. Currently there is no accepted market
transformation evaluation protocol. However, the California general evaluation
protocols attempt to include market transformation in the market effect
protocol.39
Market transformation analysis uses the ideal S-curve for innovation and
diffusion. In Figure 15 the (red) S-curve indicates rising market share.40
Actual curves using empirical data may not look like the ideal-typical S-curve.
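The ideal-typical S-curve is commonly modeled as a logistic function of time. The following sketch uses illustrative parameters; the saturation share, midpoint year, and rate are not fitted to any Nova Scotia market.

```python
import math

# Illustrative sketch: an ideal-typical adoption S-curve as a logistic
# function of time. All parameter values are hypothetical.

def logistic_share(year, saturation=1.0, midpoint=5.0, rate=1.0):
    """Market share at a given year on an ideal-typical S-curve."""
    return saturation / (1.0 + math.exp(-rate * (year - midpoint)))

# Share rises slowly, inflects at the midpoint year, then saturates.
for year in (0, 5, 10):
    print(round(logistic_share(year), 3))  # prints 0.007, then 0.5, then 0.993
```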
37 “NEEA’s definition of market transformation,” undated. https://neea.org/docs/default-
source/marketing-tookits/neea_definition_of_markettransformation.pdf?sfvrsn=2
38 See 2016 Instant SV-R1 for A-Type LEDs, 2016 Recommendation Green Heat SV-R1 for high
efficiency mini-spilt heat pumps and 2016 Recommendation Business Energy Rebates SV-R1 for LED
measures. Lighting MEPs might also be treated in an empirical market transformation approach.
39 Hall, et al., California Energy Efficiency Evaluation Protocols: Technical, Methodological and
Reporting Requirements for Evaluation Professionals. Oregon, Wisconsin: TecMarket Works, April
2006, Pp. 143-162. (http://www.calmac.org/events/EvaluatorsProtocols_Final_AdoptedviaRuling_06-
19-2006.pdf).
40 Figure 15 is reproduced from Figure 2, P. 17 in NMR Group, Inc., A Review of Effective Practices
for the Planning, Design, Implementation and Evaluation of Market Transformation Efforts.
Somerville, Massachusetts, Nov. 25, 2013, CALMAC Study ID PGE0330.01, Submitted to Pacific Gas
& Electric, San Diego Gas & Electric, Southern California Edison, and Southern California Gas. The
figure is based on E.M. Rogers, Diffusion of Innovations (5th edition). New York: Free Press, 2003.
“Market Transformation is the strategic process of intervening in a
market to create lasting change in market behavior by removing
identified barriers or exploiting opportunities to accelerate the adoption
of all cost-effective energy efficiency as a matter of standard practice.”
- NEEA
Figure 15: Ideal-Typical S-curve.
A more sophisticated S-curve has been developed by NEEA (Figure 16).41
Figure 16: S-Curve. (NEEA)
41 “NEEA’s Definition of Market Transformation”, undated. (http://neea.org/docs/default-
source/marketing-tookits/neea_definition_of_markettransformation.pdf?sfvrsn=2) Sourced 05-31-
2017.
Figure 16 includes the “S” shape (the light blue band of market transformation), plus the baseline condition and the ultimate “lock-in” with codes and standards (red lock). Investment is the (dark blue) line that declines to zero as production increases.
The nature of the process is indicated by NEEA (Figure 17).42 It is the
iteration of this sequence that leads to a set of linked S-curves (each component curve like the S-curve in Figure 15) under an envelope curve.
Figure 17: Iterative Sequence of Transformation Steps. (NEEA)
The (dark blue) curve in Figure 16 is a declining cost curve. The declining
cost curve is further developed in Figure 18 using data from an example
program. Figure 18 is a “declining cost per unit kWh” curve from an early
manufactured housing market transformation study. Three versions of the
curve are shown, each calculated by a different independent evaluator (RER,
PNL and Ecotope).
Before market transformation was fully understood, this project was initiated in
the US Pacific Northwest as a resource acquisition project. Regional utilities
in cooperation with the Bonneville Power Administration pooled funds to buy
down to zero the difference between an energy efficient manufactured home
and a common practice manufactured home (a 100% buydown). The program
was cost-effective using the Total Resource Cost (TRC) test, considered
solely from the perspective of direct resource acquisition. After a few years of
cooperation with several manufactured housing corporations, the utilities
suddenly canceled the buy down (illustrated as the vertical line on the graph in
42 NEEA, Op. cit.
Figure 18). Note that the cost curve was already downward sloping in the
resource acquisition phase before reaching this line.
Moving to the right of the vertical line, there are no more incentive payments, but most of the factories, having been retooled and stocked with supplies
for the energy efficient units, continued to produce efficient units as a new
normal practice. In some cases, the old baseline common practice units were
also using some efficient parts, such as windows, because given fixed factory
floor space it was more practical to stock one set of windows rather than two.
Figure 18 shows ongoing benefit due to change in market structure after ending the incentive. It anticipates the kind of evaluation approach that might be useful in Nova Scotia as some measures
reach toward market transformation and marketing and promotion can
change.43 A market transformation approach to evaluation might be
considered for portions of selected future evaluations. Cost would need to be
discussed with the Evaluator.
Figure 18: Benefits captured in Market Transformation Evaluation.
43 Peach, H. Gil, C. Eric Bonnyman, Anne West, and Agneta Persson, Pacific Northwest Energy-
Efficient Manufactured Housing: A Time of Paradigm Shift and Transition. Beaverton, Oregon: H. Gil
Peach & Associates/Scan America®, 1997, Monograph 97-9-1. The situation is more complex than
shown due to income differences. The manufactured housing market has a high-end, a middle and a
low-end segment. The low-income segment likely requires continued subsidy while the high-end can
promote itself and the middle segment likely requires a strong marketing and promotional effort.
• Climate Mitigation. Climate mitigation is an effect of DSM programs. It
might make sense to require all DSM evaluations to include an accounting of
these “non-utility” benefits. The effort for including this piece in program
evaluations simply involves multiplying and reporting. The cost would need to
be discussed with the Evaluator.
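The “multiplying and reporting” step could look like the following sketch; the grid emission factor is a placeholder, not Nova Scotia’s actual grid intensity.

```python
# Illustrative sketch: avoided emissions as net savings times a grid emission
# factor. The 0.6 kg CO2e/kWh factor is a placeholder, not an actual value.

def avoided_co2_tonnes(net_kwh, kg_co2e_per_kwh):
    """Avoided emissions (tonnes CO2e) from net energy savings."""
    return net_kwh * kg_co2e_per_kwh / 1000.0

print(avoided_co2_tonnes(10_000_000, kg_co2e_per_kwh=0.6))  # 6000.0
```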
• Climate Adaptation. In Europe, DSM is largely driven by climate concerns.
The intersection of climate and DSM is also recognized in Canada. While
climate mitigation helps motivate DSM and is conventionally reported, we do
not have climate adaptation within our currently defined scope (see yellow text
box on P. 1). Given that climate change is now experienced in Nova Scotia
each year, it may be worth considering how climate work and DSM intersect,
and the degree to which climate adaptation could figure into DSM programs
and evaluation.
The intersection of DSM and climate adaptation is evolving. Currently, for
most DSM programs, the climate trend appears not to be important in the
short-range, due to the relatively slow pace of advance of climate change in relation
to the average life of most DSM measures. However, the advance of climate
change involves the dialectical process called “the transformation of quantity
into quality”, so is inherently a tricky area. Right now, DSM work and climate
adaptation work are essentially in parallel silos, yet their evolving intersection
warrants discussion.
The exception among DSM programs is new construction, due to the long
actual life of buildings. In the DSM silo, we have one way of defining the life of
a new building; it is generally an underestimate but has been useful. In the
climate adaptation silo, a new building is viewed as the initial life plus the lives
from two additional building renovations. In this way, DSM might say the
building has a life of 40-60 years for calculation purposes, while a climate
adaptation analysis might assign a life of 200 or 300 years (in climate adaptation analysis, building lives are physical, though planned replacement dates are used). ENS might consider a dialog with appropriate agencies of
provincial government and with climate adaptation advocacy organizations
with a view towards mutually defining the interaction of climate adaptation with
new construction and major renovation programs to see if there are strategic
insights to include within program designs and within evaluation.
XII. Summary of Findings
Finding 2016 General-SV-1: The evaluator’s approaches and analysis for all evaluations in Program Year 2016 are fully adequate and complete. (P. 20)

Finding 2016 General-SV-2: The evaluator’s presentation format of the Program Year 2016 evaluation is fully adequate and complete. (P. 20)

Finding 2016 General-SV-3: All the 2016 evaluations are within accepted industry protocols or a relevant evaluation framework where a protocol does not exist. (P. 20)

Finding 2016 General-SV-4: ENS has stopped doing process evaluation in nine of
twelve program areas (Table 12). For three years in a row these programs will not
have process evaluations. ENS plans to drop five of twelve market evaluations for
the 2018 program year (Table 13). ENS plans to use reduced effort for three of
twelve programs for the 2017 program year and for five of twelve programs for the
2018 program year (Table 11). The market evaluation and impact evaluation
reductions do not affect program year 2016 evaluations. The 2016 evaluations are
affected by having no process evaluations. (P. 21)
Finding 2016 General-SV-5: When a program shifts from ENS control to reliance
upon markets, ENS loses some ability to ensure quality. (P. 21)
XIII. Summary of General Recommendations

Recommendation 2016 General-SV-1: The Savings Verification study recommends acceptance of the 2016 evaluation results for energy savings and for demand-reduction for all programs. (P. 22)

Recommendation 2016 General-SV-2: In future years, ENS should require the
Evaluator to include a copy of the operative evaluation plan within the overall
Executive Summary, supplemented by a discussion of why, on balance, selected
dropping of evaluation components or introduction of reduced rigor is considered
appropriate. In addition, in the individual program write-ups, dropped and reduced
evaluation components should be noted and the rationale for dropping or reducing
these evaluation components should be explained. (P. 22)
Recommendation 2016 General-SV-3: In future years, ENS should require the
Evaluator to provide a summary table of obtained confidence and precision levels
along with population and sample size for all calculations for which confidence and
precision levels were developed. Also, the Evaluator should state the overall targets
for confidence and precision for the evaluations, and provide a discussion of results
in the table. Any variation in the targets and any results that do not meet the targets
should be explained.
Recommendation 2016 General-SV-4: In future years, ENS should specify rules to
bound the dropping of evaluation components and to bound the reduction of program
impact evaluations. There should be a systematic understanding and articulation of
risks to evaluations producing usable and defensible knowledge. For example, a rule
could be that impact evaluation for a program will be reduced no more frequently
than every other year. Or that process evaluations will not be skipped for more than two years in a row. It is quite possible to do some trade-offs to conserve dollars, with the approval
of the Advisory Group. But these kinds of trade-offs need to be carefully watched
and rules set up to protect the continuing validity and precision of evaluation results.
(P. 23)
Recommendation 2016 General-SV-5: ENS should increase on-site inspection for
quality control for all programs that shift from ENS control to reliance on markets.
(P. 24)
XIV. Summary of General Considerations

Consideration 2016 General-SV-1: In addition to reporting first-year impacts, ENS
and the Advisory Group should consider the benefits and cost of also providing
savings estimates based on expected lifetimes to support comparisons among
programs and to improve foresight. (P.24)
Consideration 2016 General-SV-2: For program types that permit, ENS and the
Advisory Group should consider the potential benefits and costs of introducing
selected persistence evaluations to empirically document lifetimes and persistence of
savings. (P. 24)
Consideration 2016 General-SV-3: ENS and the Advisory Group should consider
the potential benefits and costs involved in employing a market transformation framework. This would apply to a limited subset of programs/measures (see specific
program recommendations). (P. 25)
Consideration 2016 General-SV-4: ENS and the Advisory Group should consider
the potential benefits and costs involved in requiring the Evaluator to report non-utility
benefits for climate mitigation. (P. 25)
Consideration 2016 General-SV-5: ENS and the Advisory Group should consider
initiating discussion with the province and climate adaptation advocates of the
intersection between DSM and climate adaptation, particularly in new construction.
(P. 25)
XV. Summary of Individual Program Area Recommendations

Recommendation 2016 Instant-SV-R1: For A-type LEDs in the 2017 Instant Savings Program Evaluation, ENS should consider the costs and benefits of using a market transformation paradigm rather than a resource acquisition approach (including study of market effects). This would include development of the S-curve for the A-type LEDs and a full discussion of market transformation program implications. The evaluation would also discuss monitoring A-type LEDs in future years to confirm the stability of a transformed market, and the attribution of subsequent market impacts. (P. 30)

Recommendation 2016 Green Heat-SV-R1: In the 2017 evaluation for the Green
Heat high efficiency mini-split heat pump measure, ENS should consider the benefits
and costs of structuring the evaluation as a market transformation evaluation.
Continuing treatment as a resource acquisition evaluation with an expanded market
evaluation is stretching against the boundaries of that type of evaluation approach.
(P. 35)
Recommendation 2016 New Home Construction-SV-R1: ENS should consider meeting with the appropriate provincial agency and environmental advocates to discuss the evolving intersection of DSM with climate adaptation, particularly for new construction and major renovation projects such as Net Zero and Passivhaus projects. (P. 40)

Recommendation 2016 Business Energy Rebates-SV-R1: ENS should consider the benefits and costs of structuring the 2017 evaluation for the LED measures for Business Energy Rebates as a market transformation evaluation rather than as a resource acquisition evaluation with a market evaluation. (P. 43)

Recommendation 2016 Custom-SV-R1: ENS should consider meeting with the appropriate provincial agency and with climate adaptation advocates to explore the evolving overlap of DSM and climate adaptation and to develop a list of climate adaptation criteria for new construction and major renovation projects. (P. 46)

Recommendation 2016 EMIS-SV-1: We agree with the Service Provider recommendation of greater coordination between the EMIS and SEM programs. Further, we recommend that ENS consider merging the two programs into a single customer offer. While there are unique attributes of each that would need to be considered when combining SEM and EMIS, other utilities offer a single program for both. (P. 47)
Recommendation 2016 EMIS-SV-2: Add the following EMIS-related terms to the Definitions (p. iii): Baseline regression, Projected baseline, Adjusted baseline, Cumulative EMIS savings, Incremental EMIS savings. The EMIS definitions for cumulative and incremental savings should reference the relationship between reported energy usage and the projected baseline. (P. 47)

Recommendation 2016 EMIS-SV-3: One way to more clearly communicate the approach is to show the results for each step and each project. Having a table with columns for 2016 Projected baseline, Adjustments, Adjusted baseline, Actual energy usage and Incremental savings would more clearly show the methodology and results. More columns may be required to show certain details such as the adjustment for savings from other ECM programs. Presumably such a table would also make clear how the value of 0.431 “Adjustment Ratio for Energy Savings” was calculated. (P. 48)

Recommendation 2016 SEM-SV-1: Add the following SEM-related terms to the Definitions (p. iii): Baseline regression, Projected baseline, Adjusted baseline, Cumulative SEM/EMIS savings, Incremental SEM/EMIS savings. The SEM/EMIS definitions for cumulative and incremental savings should reference the relationship between reported energy usage and the projected baseline. (P. 50)

Recommendation 2016 SEM-SV-2: We support the Service Provider recommendation of a longer engagement with SEM participants. (P. 50)

Recommendation 2016 SEM-SV-3: One way to more clearly communicate the approach is to show the results for each step and each project. Having a table with columns for 2016 Projected baseline, Adjustments, Adjusted baseline, Actual energy usage and Incremental savings would more clearly show the methodology and results. More columns may be required to show certain details such as the adjustment for savings from other ECM programs.
Presumably such a table would also make clear how the value of 0.463 “Adjustment Ratio for Energy Savings” was calculated. (P. 51) Recommendation 2016 SEM-SV-4 (Edits/Corrections): On P. 55, Figure 11, should the 2015 participant bar be 6 and not 9? On P. 64, third bullet, should the bullet read “Three respondents…” instead of “Two respondents…” See Table 24, p 62. (P. 51) Recommendation 2016 MEPS-SV-1: For future evaluations, please systematically address what savings would have resulted naturally via Naturally Occurring Market Adoption Factors (NOMADs) for each MEP, and decrease impacts by those amounts for each MEP. (P. 58) Recommendation 2016 MEPS-SV-2: In the next evaluation, the evaluator should carry out the attribution step for each MEP. (P. 60)
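To illustrate the tabular layout suggested in Recommendations EMIS-SV-3 and SEM-SV-3, the following minimal sketch assembles a per-project results table. The project names and kWh figures are hypothetical, and the column arithmetic (adjusted baseline = projected baseline minus adjustments; incremental savings = adjusted baseline minus actual usage) reflects our reading of the evaluation methodology, not the Service Provider's actual workbook:

```python
# Hypothetical illustration of the recommended per-project results table.
# Figures are invented for demonstration only.

projects = [
    # (name, projected baseline kWh, adjustments kWh, actual usage kWh)
    ("Project A", 1_200_000, 50_000, 980_000),
    ("Project B", 800_000, 0, 760_000),
]

header = ("Project", "Projected", "Adjustments", "Adjusted", "Actual", "Incremental")
print("{:<10} {:>12} {:>12} {:>12} {:>12} {:>12}".format(*header))
for name, projected, adjustment, actual in projects:
    adjusted = projected - adjustment   # remove savings attributed to other ECM programs
    incremental = adjusted - actual     # savings relative to the adjusted baseline
    print(f"{name:<10} {projected:>12,} {adjustment:>12,} "
          f"{adjusted:>12,} {actual:>12,} {incremental:>12,}")
```

Showing each column explicitly in this way would let readers verify every step, including how any overall adjustment ratio was derived from the component values.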
Recommendation 2016 MEPS-SV-3: Focus the 2017 evaluation on the same standards as evaluated in 2016. While the new standards set forth in Amendment 13 will come into force at the end of June 2017, their impact may be limited until January 2018, since retailers need a few months to sell their existing stocks. (P. 61)
XVI. References
Abt Associates, Climate Adaptation: The State of Practice in U.S. Communities,
November 2016. (http://kresge.org/sites/default/files/library/climate-adaptation-the-
state-of-practice-in-us-communities-full-report.pdf)
Ackerman, Frank, Can We Afford the Future, The Economics of a Warming World.
London & New York: Zed Books, 2009, P. 18.
Bender, Tom, Learning to Count What Really Counts, The Economics of Wholeness.
Manzanita, Oregon: Fire River Press, 2002.
Chappell, Codes and Standards Program Attribution – Potential Options. PowerPoint
for the Midwest Energy Efficiency Alliance’s Midwest Building Energy Codes
Conference, October 3, 2010, P. 4.
(http://mwalliance.org/sites/default/files/uploads/meeaconference/MES-
2013_presentations_Chappell.pdf).
Cribb, Julian, Surviving the 21st Century, Humanity’s Ten Great Challenges and How
We Can Overcome Them. Springer International Publishing AG, Switzerland, 2017.
Dekker, Sidney, Drift Into Failure, From Hunting Broken Components to
Understanding Complex Systems. Surrey, England & Burlington, Vermont: Ashgate,
2011.
Dunn, William N., “Reforms as Arguments”, Pp. 293-326 in Knowledge:
Creation/Diffusion/Utilization, Vol. 3, No. 3, March 1982.
Econoler, et al., 2016 DSM Evaluation Reports; Efficiency Nova Scotia Corporation.
Galbraith, John Kenneth, A Theory of Price Control. Cambridge, Massachusetts:
Harvard University Press, 1952.
Gellings, Clark W. & John H. Chamberlin, Demand-Side Management Planning
Concepts & Methods, Second Edition, Lilburn, Georgia: The Fairmont Press, 1992.
Gellings, Clark W. & John H. Chamberlin, Demand-Side Management Planning,
Lilburn, Georgia: The Fairmont Press, 1993.
Georgescu-Roegen, Nicholas, The Entropy Law and the Economic Process. Cambridge,
Massachusetts & London, England: Harvard University Press, 1971, 1999.
Ghosh, Amitav, The Great Derangement, Climate Change and the Unthinkable.
Chicago and London: The University of Chicago Press, 2016.
Granderson, J., Price, P., Jump, D., Addy, N., Sohn, M. (2015). Automated
Measurement and Verification: Performance of Public Domain Whole-Building
Electric Baseline Models. Applied Energy 144:106-13 (http://eis.lbl.gov/pubs/lbnl-
187596.pdf).
Göransson, Christina and Sven Faugert, Effective Market Influence, an effect chain
analysis of NUTEK’s high frequency lighting campaign. Stockholm: NUTEK, 1994,
No. R1994-70.
Hoffman, Ian M., Steven R. Schiller, Annika Todd, Megan A. Billingsley, Charles A.
Goldman, Lisa C. Schwartz, "Energy Savings Lifetimes and Persistence: Practices,
Issues and Data." Berkeley, California: Lawrence Berkeley National Laboratory,
Electricity Markets & Policy Group Technical Report, May 2015.
International Energy Agency, Creating Markets for Energy Technologies. Paris:
Organization for Economic Cooperation and Development/International Energy
Agency, 2003.
Kahneman, Daniel, Thinking, Fast and Slow. New York: Farrar, Straus and Giroux,
2011.
Keating, Kenneth M., Ruth L. Love, Terry V. Oliver, H. Gil Peach & Cynthia B. Flynn,
“The Hood River Project: Take a Walk on the Applied Side,” Pp. 112-118, The Rural
Sociologist, Vol. 5, No. 2, May 1985.
Kolbert, Elizabeth, The Sixth Extinction, An Unnatural History. New York: Henry Holt
& Company, 2014.
Lawrence Berkeley Laboratory, Building Energy Information Systems and
Performance Monitoring Tools, 2014-2016 Assessment of Automated M&V Methods,
Summary of Work (http://eis.lbl.gov/auto-mv.html).
Leakey, Richard E. & Roger Lewin, The Sixth Extinction, Patterns of Life and the
Future of Humankind. New York: Anchor Books, 1995.
Lee, Allen, Energy-efficiency Building Codes and Appliance Standards (C&S),
http://www.cadmusgroup.com/wp-content/uploads/2012/11/CS-Article-for-Strategies-
Allen-Lee.pdf.
Leritx, Gilless & Hart, “Strategic Energy Management – It’s Time to Grow Up! A
Maturity Model for SEM Implementation.” Proceedings of the 2014 ACEEE Summer
Study on Energy Efficiency in Buildings.
Mahone, Douglas, Nick Hall, Lori Megdal, Ken Keating & Rick Ridge, “Codes and
Standards White Paper on Methods for Estimating Savings,” Prepared for Marian
Brown, SCE in support of Statewide NRNC MA&E, April 7, 2015.
(http://www.cpuc.ca.gov/NR/rdonlyres/6E783BC7-3467-484E-AD2A-
29EF4A50432B/0/Mahone_2005_CS_White_Paper_SavingsEstimatingSavings.pdf).
Mass Save, Savings & Evaluation Methodology for Codes and Standards Initiative,
Submitted to Massachusetts Department of Energy Resources on behalf of
Massachusetts Program Administrators, October 20, 2015. (http://ma-
eeac.org/wordpress/wp-content/uploads/Savings-Evaluation-Methodology-for-Codes-
and-Standards-Initiative.pdf)
Mitroff, Ian I. & Richard O. Mason, Creating a Dialectical Social Science, Theory and
Decision Library, Volume 25. Dordrecht, Boston & London: D. Reidel Publishing
Company, 1981.
Morse, William L. & H. Gil Peach, “Control Concepts in Conservation Supply,”
Energy, Vol. 14, No. 11, Pp. 727-735, 1989.
National Efficiency Screening Project, The National Standard Practice Manual,
Edition 1, Spring 2017, May 18, 2017.
(https://nationalefficiencyscreening.org/national-standard-practice-manual/)
“NEEA’s definition of market transformation,” undated. https://neea.org/docs/default-
source/marketing-tookits/neea_definition_of_markettransformation.pdf?sfvrsn=2.
Nova Scotia’s Action on Climate Change, https://climatechange.novascotia.ca/action-
on-climate-change (accessed 04/10/2017).
NMR Group, Inc., A Review of Effective Practices for the Planning, Design,
Implementation and Evaluation of Market Transformation Efforts. Somerville,
Massachusetts, Nov. 25, 2013, CALMAC Study ID PGE0330.01, Submitted to Pacific
Gas & Electric, San Diego Gas & Electric, Southern California Edison, and Southern
California Gas.
Nutek, Technology Procurement as a Policy Instrument. Stockholm: Swedish
National Board for Industrial and Technical Development, 1995 (R 1995:16).
Opportunities for Action on Energy Management Information Systems for Industrial
Customers: A Report for Program Administrators, NEEA Report E15-292.
Peach, H. Gil, C. Eric Bonnyman, Agneta Persson & Anne West, Pacific Northwest
Energy-Efficient Manufactured Housing: A Time of Paradigm Shift and Transition.
Beaverton, Oregon: H. Gil Peach & Associates, August 1997, a report for the
Bonneville Power Administration.
Peach, H. Gil, Anne Minor West, Howard S. Reichmuth & Pamela Brandis, “Seven
Years After: Impact Evaluation Results Employing Extensive Site Inspection Data
and Associated Pre/Post Billing Analysis,” Pp. 3.97-3.104 in the Proceedings of the
1996 ACEEE Summer Study on Energy Efficiency in Buildings, Panel 3, Residential
Programs: Program Evaluation.
Prahl, Ralph and Ken Keating, Building a Policy Framework to Support Energy
Efficiency Market Transformation in California. California Public Utility Commission,
December 9, 2014, P. 12.
Rogers, E.M., Diffusion of Innovations (5th edition). New York: Free Press, 2003.
State and Local Energy Efficiency Action Network, Energy Efficiency Program Impact
Evaluation Guide, 2012. Prepared by Steven R. Schiller, Schiller Consulting, Inc.,
www.seeaction.energy.gov.
New York State Department of Public Service, New York Evaluation Guidelines
http://www3.dps.ny.gov/W/PSCWeb.nsf/96f0fec0b45a3c6485257688006a701a/766a
83dce56eca35852576da006d79a7/$FILE/NY_Eval_Guidance_Aug_2013.pdf.
The MEEA Midwest Regional Codes Conference, papers and reports. See:
(http://www.mwalliance.org/policy/midwest-regional-energy-codes-conference).
Scranton, Roy, Learning to Die in the Anthropocene, Reflections on the End of a
Civilization. San Francisco: City Lights Books, 2015.
TecMarket Works, California Energy Efficiency Evaluation Protocols: Technical,
Methodological and Reporting Requirements for Evaluation Professionals. Oregon,
Wisconsin: TecMarket Works, April 2006.
(http://www.calmac.org/events/EvaluatorsProtocols_Final_AdoptedviaRuling_06-19-
2006.pdf).
Tetra Tech, Inc., Process Evaluation of the Energy Management Systems Program—
Colorado. December 17, 2014.
Uniform Methods Project Refrigerator Recycling Evaluation Protocol. See:
(http://www1.eere.energy.gov/wip/pdfs/53827-7.pdf). Prepared by Doug Bruchs and
Josh Keeling, The Cadmus Group, April 2013.
Uniform Methods Project Residential Lighting Evaluation Protocol. See:
(http://energy.gov/sites/prod/files/2015/02/f19/UMPChapter21-residential-lighting-
evaluation-protocol.pdf). Prepared by Scott Dimetrosky, Katie Parkinson,
and Noah Lieb, Apex Analytics, LLC, December 2014.
Westling, Hans, Cooperative Procurement: Market Acceptance for Innovative
Energy-Efficient Technologies, Nutek, Stockholm, Sweden, 1996.
Suggested Citation:
Peach, H. Gil, John Mitchell, Mark Thompson & C. Eric Bonnyman, Verification
Review of Program Year 2016 Evaluation Results/Report for the Nova Scotia Utilities
and Review Board. H. Gil Peach & Associates/Scan America®, July 2017.