Evaluation Update and Draft Results of the Site-Specific Savings Portfolio

This document provides an update and draft results of the Site-Specific Savings portfolio evaluation, presented on November 18, 2015 by Lauren Gage of BPA and Michael Baker of SBW Consulting, Inc. It covers the purpose of the evaluation, what was achieved, strategies for improvement, and how to make full use of the findings. The evaluation background, policies, strategies, and upcoming plans are also discussed in detail.




Presentation Transcript


  1. Evaluation Update and Draft Results of the Site-Specific Savings Portfolio. November 18, 2015. Prepared by: Lauren Gage, BPA, and Michael Baker, SBW Consulting, Inc. Prepared for: Regional Webinar.

  2. Why Evaluation?
  What did we achieve?
  • Objectively, retrospectively documents and measures the effects of a program to determine how well it has met its intended outcomes or goals
  • Accurate and reliable measures
  • Transparency and accountability
  • Effectiveness of spending
  How do we improve?
  • Understand why effects occurred and identify ways to improve current and future programs
  • Constructive and strategic feedback
  • Understand, improve, get new achievements
  • More to come!

  3. Evaluation Update
  • Policies: QSSI policies approved by Sponsors
  • Plans: 2016 strategy finalizing (UES evaluation focus); Why Evaluation brownbag
  • In progress: Simple Steps; Energy Management Pilot billing analysis testing; Clark OPower brownbag
  • Recent results: Site-Specific Evaluation
  www.bpa.gov/goto/evaluation

  4. Today's Meeting: Design → Evaluate → Implement (program cycle)

  5. BPA Evaluation Thoughts
  • Great results: confirms great work by utility and BPA staff; some areas of improvement; study meets several needs
  • Huge thanks! Internal team, utility staff, SBW/Cadmus team
  • Good process, not perfect
    • Successes: transparency with utilities, multi-functional team, lots of BPA review
    • Improvement needed: duration, communication protocol tracking
  • Next steps important: BPA response to recommendations; how can we use it fully?

  6. Background: Site-Specific Evaluation

  7. Evaluation Background
  • Custom and calculator projects, all sectors: ~40% of BPA's 2012-2013 achievements (the Site-Specific Savings portfolio)
  • Little recent evaluation: non-res lighting evaluated in 2008; Energy Management Pilot evaluated in 2012 (new, separate evaluation forthcoming)
  Timeline
  • Evaluation plan: June to December 2013
  • Sample selection and contact: February to May 2014
  • Data collection: June 2014 to June 2015
  • Analysis and report preparation: July to August 2015
  • Review and finalize report: September 2015 to now

  8. Objectives
  • Estimate first-year kWh savings for the portfolio and 9 domains
  • Estimate lifecycle cost-effectiveness
  • Identify opportunities for improving processes, M&V practices, and evaluation

  9. Domains
  Site-Specific Savings Portfolio:
  • Option 1 Lighting: Commercial/Ag, Industrial
  • Option 1 Non-lighting: Commercial/Ag, Industrial
  • Option 2 Lighting: Commercial, Industrial
  • Option 2 Non-lighting: Commercial, Industrial
  • ESRP

  10. Sample Design
  [Bar chart: sampled measures (0 to 250) for Opt 1 Ltg Com/Ag, Opt 1 Ltg Ind, Opt 1 Non-Ltg Com/Ag, Opt 1 Non-Ltg Ind, Opt 2 Ltg Com, Opt 2 Ltg Ind, Opt 2 Non-Ltg Com, Opt 2 Non-Ltg Ind, ESRP, and Overall, split into BPA-funded and utility-funded measures]
  • Sample represents ~28% of savings, but less than 3% of measures in the population
  • Utilities funded 1/3 of measures

  11. More Evaluation Information
  • Supplemental data collection: the majority of sites needed some data beyond program documentation (phone surveys 93%, on-site 86%, metering 62%)
  • Oversample: 3 utilities oversampled; they funded 31% of measures in the study; allows separate estimates for their service areas
  • Response rate: good acceptance from the sample (90%); little risk of non-response bias

  12. Estimating Site-Specific Savings: Guiding Principles
  • Treat all measures consistently. Small savers are just as important as large savers in a stratified random sample.
  • Reuse available data. We re-used as much of the program-collected data as we determined to be reliable.
  • Focus on the key determinants and areas with the greatest savings.
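Editor's note: the stratified-sample principle above can be sketched in a few lines of Python. All stratum names, counts, and kWh figures below are made up for illustration, not taken from the evaluation; the point is only that each sampled measure is expanded by its stratum's population-to-sample ratio, so small savers carry the same expansion treatment as large savers.

```python
# Minimal sketch of stratified expansion (hypothetical numbers only).
# Each sampled measure represents population_n / sample_n measures
# from its stratum.

strata = {
    # stratum name: (population count, sample count, evaluated kWh in sample)
    "large_savers": (40, 20, 5_000_000),
    "small_savers": (960, 30, 600_000),
}

portfolio_estimate = 0.0
for name, (pop_n, samp_n, sample_kwh) in strata.items():
    weight = pop_n / samp_n            # expansion weight for this stratum
    portfolio_estimate += weight * sample_kwh

print(f"Estimated portfolio savings: {portfolio_estimate:,.0f} kWh")
```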

  13. Process for Estimating Measure Savings

  14. Review by BPA and Utilities
  • BPA-funded measure review: new models and site-specific results. BPA reviewed all non-lighting measures and a sample of lighting measures.
  • Utility review: results provided; one-on-one discussions offered, but not many occurred
  • Oversample utility review: some one-on-one discussions with utility staff

  15. What is a Realization Rate (RR)?
  Realization rate = Evaluation savings / Reported savings
  • Realization rates greater than 1 mean that we found more savings than was reported
  • Realization rates less than 1 mean fewer savings were found
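Editor's note: the definition on this slide is simple enough to verify in code. A minimal Python sketch (the kWh figures are illustrative, not the evaluation's data) showing per-measure RRs and why the portfolio RR, being a ratio of totals, lets measure-level highs and lows cancel:

```python
# Realization rate, per the slide: RR = evaluation savings / reported savings.
# Numbers are illustrative only.

measures = [
    # (reported kWh, evaluated kWh)
    (100_000, 120_000),  # RR = 1.2: more savings found than reported
    (200_000, 170_000),  # RR = 0.85: fewer savings found
]

for reported, evaluated in measures:
    print(f"RR = {evaluated / reported:.2f}")

# The portfolio RR is the ratio of totals, i.e. savings-weighted,
# so high and low measures can offset each other.
portfolio_rr = sum(e for _, e in measures) / sum(r for r, _ in measures)
print(f"Portfolio RR = {portfolio_rr:.2f}")  # 0.97
```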

  16. Findings: Site-Specific Evaluation

  17. Overall Results
  • Evaluation savings for the portfolio are nearly the same as the reported savings: RR is 0.98; highs and lows tend to cancel out
  Combined domains:
  • Measure: Lighting RR is 1.0 and Non-lighting RR is 1.03
  • Sector: both Commercial and Industrial RRs are 0.98
  • Option: Option 1 RR is 0.98, Option 2 is 1.08

  18. Measure Realization Rates
  [Scatter plot: realization rate (0 to 3.5) for ~200 sampled measures, by domain: Opt 1 Ltg Com&Ag, Opt 1 Ltg Industrial, Opt 1 Non-Ltg Com&Ag, Opt 1 Non-Ltg Industrial, Opt 2 Ltg Commercial, Opt 2 Ltg Industrial, Opt 2 Non-Ltg Commercial, Opt 2 Non-Ltg Industrial, ESRP]
  • Quite a bit of scatter by measure
  • Approximately 40% of the portfolio has either a high or a low realization rate; essentially equal numbers high and low

  19. Life-Cycle Cost-Effectiveness
  [Bar chart: benefit-cost ratio (0 to 4.50) by domain and overall, TRC vs. TRC with no NEBs]
  • All domains and the portfolio are cost-effective (TRC 2.65)
  • Non-electric benefits increase TRC by 6% (from 2.49 to 2.65)
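Editor's note: a small Python sketch to make the NEB arithmetic on this slide concrete. The costs are normalized to 100 and the NEB value of 16 is chosen only to reproduce the slide's 2.49 and 2.65 ratios; these are not figures from the report.

```python
# TRC benefit-cost ratio with and without non-electric benefits (NEBs).
# Normalized, illustrative inputs chosen to match the slide's ratios.

lifecycle_costs = 100.0
electric_benefits = 249.0   # gives TRC = 2.49 without NEBs
nebs = 16.0                 # hypothetical NEB value

trc_without_nebs = electric_benefits / lifecycle_costs
trc_with_nebs = (electric_benefits + nebs) / lifecycle_costs
uplift_pct = (trc_with_nebs / trc_without_nebs - 1) * 100

print(f"TRC without NEBs: {trc_without_nebs:.2f}")  # 2.49
print(f"TRC with NEBs:    {trc_with_nebs:.2f}")     # 2.65
print(f"NEB uplift:       {uplift_pct:.0f}%")       # 6%
```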

  20. Lighting Savings
  • Overall, lighting RR is 1.0
  • Offsetting factors between Option 1 and Option 2: Option 1 RR is 0.93; Option 2 RR is 1.08

  21. Lighting Realization Rates
  [Two scatter plots]
  • Option 1: mostly low RRs
  • Option 2: more scatter, more high RRs

  22. Option 1 Lighting
  • Evaluation found 6.8% less savings
  • All factors made less than a 5% difference in savings:
    • Metering found ~4% fewer hours of operation
    • Other changes (HVAC system type and fixture type/count) reduced savings by ~2%

  23. Option 2 Lighting
  • Evaluation found 7.9% more savings
  • Utility embedded RR was the largest factor (5% increase)
  • Small other changes:
    • Metering found ~3% fewer hours of operation
    • A change to the BPA calculator increased savings by 2%

  24. Non-Lighting Savings
  • Overall, non-lighting RR is 1.03
  • Evaluation found more savings for both options: Option 1 RR is 1.02; Option 2 RR is 1.07

  25. Non-Lighting Savings
  • Evaluation found 10%+ more savings for Option 1 and Option 2 Commercial
  • Option 2 embedded RR: under-reporting of savings (Commercial 7%, Industrial 5%)

  26. Non-Lighting Realization Rates
  [Two scatter plots]
  • Option 1 has less scatter than Option 2

  27. Energy Smart Reserve Projects (ESRP) Realization Rates
  [Scatter plot of ESRP RRs, roughly -0.2 to 1]
  • All RRs are below 1 (0% RR > 1.2; 60% RR < 0.8)
  • Three measures have RR below 0.5, and one RR is negative
  • Issues:
    • Multiple projects: downstream reuse of saved water assumptions
    • One project with atypical first-year operation

  28. Adherence to Protocols and Guidelines

  29. Compliance with Protocol Selection Guideline
  [Bar chart: compliance (0% to 100%) for Opt 1 Non-Ltg Com/Ag, Opt 1 Non-Ltg Ind, Opt 2 Non-Ltg Com, Opt 2 Non-Ltg Ind, ESRP, and Overall]
  • Option 1 has high compliance
  • Option 2 has lower compliance; in Commercial, more than half are not compliant
  • Compliance may not predict RR

  30. Other Findings
  IM documentation requirements:
  • A substantial number of invoices are missing. Option 1: Lighting (~50%), Non-lighting (~25%)
  • Some Option 1 Com/Ag completion workbooks are missing
  • Some Option 2 M&V plans are missing (~17%)
  Other documentation:
  • Some working models are missing, especially for Option 2 (~25%)
  TAP assignment:
  • The Option 1 lighting calculator does not use TAP codes
  • Option 2 has a high rate of misclassification for Lighting and Non-Lighting (~48%)

  31. Other Products
  • COTR oversight: site-specific results provided to BPA COTRs for oversight purposes
  • Lighting metering data: data to inform the RTF lighting standard protocol, commercial momentum savings, and HVAC interaction factors

  32. Recommendations

  33. Increasing Reliability of M&V Savings Estimates
  • Avoid embedded realization rates
  • Clarify M&V protocols
  • Improve QC for ESRP projects
  • Improve lighting calculators
  • Avoid or improve simplified savings calculators

  34. Improving Program Documentation
  • Investigate opportunities for reducing reporting burden
  • Require working models
  • Obtain and store contractor invoices
  • Improve document organization and version control
  • Document: project specs, milestones, M&V protocol, project engineer
  • Improve TAP coding

  35. Improving Future Evaluations
  • Align evaluation protocols with M&V protocols
  • Improve tracking of utility and end-user contact
  • Ensure all site-specific projects are included in evaluation
  • Consider faster or real-time evaluation
  • Require and simplify end-user contact

  36. Next Steps: Design → Evaluate → Implement (program cycle)

  37. Questions? Report and highlights: www.bpa.gov/goto/evaluation. Contact: Lauren S.M. Gage, lsmgage@bpa.gov
