Federal Government Brags about Being Average

I recently ran across a post on the Energy Information Administration's (EIA's) website highlighting the fact that from 2003 to 2012 Federal buildings achieved a greater decrease in energy use intensity than commercial buildings did, on average.  I find this spin offensive on several levels.

The relevant graph is shown below.

[Figure: energy use intensity of Federal and commercial buildings, 2003 and 2012]

The first thing to note is that, even with this decrease in energy use, Federal buildings still have a higher EUI than do other commercial buildings (compare the red and blue 2012 bars).

Second, while I am pleased that the Federal government is learning how to operate its buildings almost as well as the rest of the commercial building sector, it is not a remarkable accomplishment.  It reminds me of a verbal exchange between then-Governor Bill Clinton and businessman Ross Perot during a presidential debate.  Clinton was bragging that under his leadership the State of Arkansas had improved its rank among the states in education from almost last up to the middle of the pack.  Ross Perot pointed out that you don't have to be innovative when you are ranked last — you will move up just by copying what others have done.  (I will confess, this is my memory of what happened, but it might be that I imagined this exchange — it is a good story, in any case.)

Third, why is the EIA engaging in such spin?  This agency is supposed to gather and disseminate energy facts.  Spin should be left to political parties.


The Illusions of EUI in Calculating Energy Savings

In the last month I have found the time to begin looking at the 2012 CBECS data released by the EIA last May.

Today I am writing about something I just learned concerning U.S. Worship Facilities.  Here I am looking at the subset of Worship Facilities that meet the criteria stated by the EPA for performing their multivariate regression for the Worship Facility ENERGY STAR model (about 80% of all U.S. Worship Facilities).

In comparing the 2012 and the 2003 CBECS data for Worship Facilities, we see there was an estimated 2% increase in the number of these buildings.  As there is an 8-9% uncertainty in the estimated number of these facilities, this increase is not statistically significant.  The EIA data show that the mean site energy use intensity (EUI) for these facilities actually went down by 15%, from 48 to 41 kBtu/sf — and this reduction is statistically significant as it exceeds the 6-8% uncertainty in these figures.  No doubt some government agency will use this reduction to claim success in programs to promote energy efficiency.

But nature is not impressed, because the total energy used by these buildings actually went up.  The reason: the buildings are, on average, getting bigger.  From 2003 to 2012 the total gross square footage contained in this filtered subset of Worship Facilities increased from 3.2 to 3.8 billion sf, a whopping 23%.  Thus the total site energy used by Worship Facilities grew by 5%.  A similar conclusion can be drawn for source energy, even with the improved efficiency of the electric power sector over this last decade.
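A quick back-of-the-envelope check (using the rounded figures above) shows how a 15% drop in EUI and roughly 23% growth in floor area combine to give about 5% more total energy:

```python
# Total energy = EUI x floor area; a falling EUI can therefore hide rising total energy.
eui_2003, eui_2012 = 48.0, 41.0   # mean site EUI, kBtu/sf (rounded)
area_growth = 1.23                # ~23% growth in total gross square footage

energy_ratio = (eui_2012 / eui_2003) * area_growth
print(f"Change in mean EUI:     {eui_2012 / eui_2003 - 1:+.0%}")   # about -15%
print(f"Change in total energy: {energy_ratio - 1:+.0%}")          # about +5%
```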

It should be noted that statistics show the number of Americans who actually go to church declined by about 7% from 2007 to 2014.  So in a decade when religious worship is decreasing, the amount of energy used by Worship Facilities has grown by about 5%.

Bottom line — don’t be fooled by decreases in building EUI.  It is total energy that matters.

New CBECS Data confirm EPA’s K-12 School ENERGY STAR score is nonsense

As I have written before — indeed, it is the subject of my recent book — my work shows that the EPA's ENERGY STAR benchmarking scores for most building types are little more than placebos.  The signature feature of these scores is the assumption that the EPA can adjust for external factors that impact building energy use.  This adjustment is based on a linear regression performed on a relatively small dataset.  For most building types this regression dataset was extracted from the Energy Information Administration's 2003 Commercial Building Energy Consumption Survey (CBECS).  The EPA has never demonstrated that these regressions accurately predict a component of the energy use of the larger building stock.  They simply perform their regression and assume it is accurate in predicting EUI for other, similar buildings.

In the last three years I have challenged this assumption by testing whether the EPA regression accurately predicts energy use for buildings in a second, equivalent dataset taken from the earlier, 1999 CBECS.  In general I find these predictions to be invalid.  For one type of building — Supermarkets/Grocery Stores — I find the EPA's predictions to be no better than randomly generated numbers!

In May of this year the EIA released public data from its 2012 Commercial Building Energy Consumption Survey.  These new data provide yet another opportunity to test the EPA's predictions for nine different kinds of buildings, and they will either validate the EPA's regression models or confirm my earlier conclusion that those models are invalid.  Over the next year I will be extracting 2012 CBECS data to again test the nine ENERGY STAR benchmarking models based on CBECS data.

This week I performed the first of these tests, for K-12 Schools.  I extracted 539 records from the CBECS 2012 data for K-12 Schools, representing 230,000 schools totaling 9.2 billion gsf.  After filtering these records based on EPA criteria, 431 records remain, representing 137,000 schools with 8.0 billion gsf.

I performed the EPA's weighted regression for K-12 Schools on this final dataset and obtained results totally inconsistent with those obtained by the EPA using CBECS 2003 data.  Only 3 of the 11 variables identified by the EPA as "significant predictors" of building source EUI for K-12 Schools demonstrated statistical significance with the 2012 data.  Numerous other comparisons confirmed that the EPA's regression demonstrated no validity with this new dataset.
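For readers who want to repeat this kind of test, the sketch below shows one way to re-fit a weighted regression on a filtered CBECS 2012 extract and list which predictors remain significant.  It is only an illustration: the file name, column names, and predictor list are placeholders, not the EPA's actual variables, and the EPA filtering is assumed to have already been applied.

```python
import pandas as pd
import statsmodels.api as sm

# Hypothetical filtered extract of CBECS 2012 K-12 School records.
df = pd.read_csv("cbecs2012_k12_filtered.csv")

# Placeholder predictor names -- substitute the EPA's actual model variables.
predictors = ["weekly_hours", "worker_density", "pct_cooled", "hdd65", "cdd65"]
X = sm.add_constant(df[predictors])
y = df["source_eui"]              # annual source EUI, kBtu/sf

# CBECS provides a sampling weight for each record; use it in the fit.
fit = sm.WLS(y, X, weights=df["sample_weight"]).fit()
print(fit.summary())

significant = fit.pvalues[fit.pvalues < 0.05].index.tolist()
print("Predictors significant at the 5% level:", significant)
```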

The EPA will no doubt suggest that its model was valid for the 2003 building stock but not for the 2012 stock — because the stock has changed so much in the intervening nine years!  While this seems plausible, the explanation does not hold water.  First, the CBECS 2012 data do not suggest significant change in either the size or the energy use of the K-12 School stock.  Moreover, this explanation cannot explain why the EPA regression was also invalid for the 1999 building stock — unless the EPA wants to suggest that the stock changed so much in just four years as to render the regression invalid.  And if that is the EPA's position, then why would it even attempt to roll out new ENERGY STAR regression models for K-12 Schools based on 2012 CBECS data more than four years after these data were valid?  You can't have it both ways.  Either the stock changes rather slowly and a four-year delay is not important, or this benchmarking methodology is doomed to be irrelevant from the start.


The more plausible explanation — supported by my study — is that the EPA's regression is simply based on insufficient data and is not valid, even for the 2003 building stock.  I suggest that a regression on a second, equivalent sample from the 2003 stock would yield results that differ from the EPA's original regression.  The EPA's ENERGY STAR scores have no more validity than sugar pills.


“Building ENERGY STAR scores – good idea, bad science” book release

After more than three years in the making, I have finally published my book, Building ENERGY STAR scores — good idea, bad science.  This book is a critical analysis of the science that underpins the EPA's building ENERGY STAR benchmarking score.  The book can be purchased through Amazon.com.  It is also available as a free download at this website.


I first began looking closely at the science behind ENERGY STAR scores in late 2012. The issue had arisen in connection with my investigation of energy performance of LEED-certified office buildings in New York City using 2011 energy benchmarking data published by the Mayor’s office.  My study, published in Energy & Buildings, concluded that large (over 50,000 sf) LEED-certified office buildings in NYC used the same amount of energy as did conventional office buildings — no more, no less.  But the LEED-certified office buildings, on average, had ENERGY STAR scores about 10 points higher than did the conventional buildings.  This puzzled me.

So I dug into the technical methodology employed by the EPA for calculating these ENERGY STAR scores.  I began by looking at the score for Office buildings.  Soon thereafter I investigated Senior Care Facilities.  Over the next three years I would dig into the details of ENERGY STAR models for 13 different kinds of buildings.  Some preliminary findings were published in the 2014 ACEEE Summer Study on Energy Efficiency in Buildings.  A year later I presented a second paper on this topic at the 2015 International Energy Program Evaluation Conference (IEPEC).  Both of these papers were very limited in scope and simply did not allow the space necessary to include the detailed analysis.  So I decided to write a book that contained a separate chapter devoted to each of the 13 types of buildings.  In time the book grew to 18 chapters and an appendix.

This book is not for the general audience — it is highly technical.  In the future I plan to write various essays for a more general audience that do not contain the technical details. Those interested can turn to this book for the details.

As mentioned above, the printed copy of the book is available through Amazon.com.  Anyone interested in an electronic copy should send me a request via email with their contact information.  Alternatively, an electronic copy may be downloaded from this website.

Incidentally, the book is priced as low as possible — I do not receive one cent of royalty.  The cost is driven by the choice of large paper and color printing — it was just going to be too much work to redo all the graphs so that they were discernible in black and white!


NYC Energy Benchmarking Report Overestimates Energy Savings

The Mayor's Office in New York City has recently released its annual report on the 2013 energy data for commercial buildings.  This is the fourth such report.  Each annual report appears to take longer and longer to prepare, suggesting it is easier to gather energy data than to analyze and understand it.

The report's lead finding is that over the four-year period 2010-2013, greenhouse gas emissions associated with NYC building energy decreased by 8% and building energy use decreased by 6%.  The authors cannot resist suggesting that NYC's energy benchmarking program can take credit for this reduction.

My analysis of these data shows the savings are only half this amount.  The other half of the claimed savings is an artifact of the EPA having lowered its national site-to-source energy conversion factor for electricity in summer 2013.  The same mistake was made by the Washington DC Department of the Environment a year ago.

NYC does not live in a vacuum.  Over the last 10 years the expanded use of natural gas and the retirement of coal plants have cleaned up the entire U.S. electric grid — of which NYC is a part.  In fact, the purchase of fracked natural gas from Pennsylvania (fracking is outlawed in NY State) is the primary driver of reduced greenhouse gas emissions in NYC.  It has little to do with NYC building policies!

The NYC analysis apparently comes from adding up the annual greenhouse gas emissions and weather-normalized source energy use of some 3,000 properties that submitted benchmarking data for all four years.  Using 2010 figures as a baseline, the relative annual reductions for these selected properties are graphed below.

[Figure 1 from the NYC report: relative annual reductions for the selected properties, 2010-2013]

Here I want to focus on the source energy curve.  As compared with 2010, energy use went up slightly in 2011, then dropped by nearly 4% in 2012 and another 2.5% in 2013.  The drop in 2012 is easy to understand — Hurricane Sandy brought the City to a grinding halt, affecting tourism and many operations.  That reduction in energy use should be viewed with great skepticism.  But the continued reduction into 2013 seems like a sign of increased energy efficiency.  Or does it?

Until 2013 the EPA used a site-to-source energy conversion factor for electric energy of 3.34.  In summer 2013 the EPA lowered this number by 6%, to 3.14.  When it generated the 2013 report for NYC it used this reduced site-to-source energy conversion factor.  In other words, the 2013 reduction in NYC's weather-normalized source energy has little to do with building operation and everything to do with the EPA adjusting source energy down for the entire nation!  And this reduction does not reflect a single-year improvement in the electric grid.  The EPA made no adjustment to this factor for many years prior to 2013, then in 2013 made a one-time adjustment to reflect a five-year average.
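A rough illustration of how large an effect this bookkeeping change has: assume, purely for illustration, that electricity accounts for 80% of the portfolio's source energy (a placeholder share, not a figure from the NYC report).

```python
old_factor, new_factor = 3.34, 3.14   # EPA site-to-source factors for electricity
elec_share = 0.80                     # assumed share of source energy from electricity

factor_change = new_factor / old_factor - 1      # about -6%
apparent_drop = elec_share * factor_change       # about -5%, with zero change in site energy
print(f"Conversion-factor change:             {factor_change:+.1%}")
print(f"Apparent drop in total source energy: {apparent_drop:+.1%}")
```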

The NYC report is based on confidential data — no public benchmarking data were released for 2010.  Nevertheless, I can mimic the analysis by looking only at public NYC benchmarking data for 2011, 2012, and 2013.  In these data I find about 1,200 buildings that reported energy data for each of these three years.  About 1,000 buildings remain after removing any that have questionable data for any of these years (i.e., site EUI >1000 or <10 kBtu/sf).  The total weather-normalized source energy for these buildings is graphed in blue below for each of the three years.  This graph mirrors the trend displayed in the NYC report.  The total site energy for these buildings is graphed in red.  The change in site energy matches the change in source energy for 2012 but not for 2013.  This confirms what I explained above — the EPA's changed site-to-source energy conversion factor is responsible for most of the apparent 2013 reduction.  The graph shows that 2013 site energy was actually higher than 2012 site energy; it did not go down at all.

[Figure: total weather-normalized source energy (blue) and total site energy (red), 2011-2013]
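For anyone who wants to reproduce this comparison from the public disclosure files, a sketch of the filtering and totals described above is shown here; the file and column names are placeholders for whatever the released spreadsheets actually use.

```python
import pandas as pd

years = [2011, 2012, 2013]
frames = {y: pd.read_csv(f"nyc_benchmarking_{y}.csv", index_col="property_id")
          for y in years}

# Keep only properties that reported in all three years.
common = set.intersection(*(set(f.index) for f in frames.values()))

# Drop any property with questionable data (site EUI above 1000 or below 10) in any year.
good = [p for p in common
        if all(10 <= frames[y].loc[p, "site_eui"] <= 1000 for y in years)]

for y in years:
    sub = frames[y].loc[good]
    print(y,
          "total weather-normalized source energy:", sub["weather_norm_source_energy"].sum(),
          "total site energy:", sub["site_energy"].sum())
```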

The simple fact is that over the three-year period shown above, the site energy use of these 1,000 buildings went down by only 3.5% — a figure which is highly uncertain given the sample size.  The 6% energy savings claimed by the Mayor's Office is obtained through faulty analysis.


What does it mean to say “the Hotel at Oberlin is solar powered?”

The Hotel at Oberlin, also called the Peter Lewis Gateway Hotel, opened two weeks ago, just in time for Oberlin College graduation.  This hotel replaces the Oberlin Inn and has been put forward as the cornerstone of what will become a sustainable block of buildings, called the green arts district.  As the budget for this building continues to swell, it is unlikely the College will make further headway on the "green arts district" for some time to come.

In multiple venues (Oberlin Alumni Magazine, Cleveland City Club, Cleveland Plain Dealer, etc.) the Special Assistant to the Oberlin College President on Sustainability and the Environment has described the Hotel at Oberlin as "100% solar powered."  Here I address the credibility of this claim.  I find the claim to be lacking in substance, yet very costly to the College.

The Hotel at Oberlin will use both electricity and natural gas.  All of its electricity will be purchased from the local utility, Oberlin Municipal Light & Power Systems (OMLPS), as is the case for nearly all Oberlin College buildings.  In addition, the hotel will use natural gas to produce all of its hot water and, if necessary, for additional winter heating should its ground-source heat pumps be unable to meet the demand.  This is a likely situation since the hotel, which is eight times the size of the Lewis Environmental Center, has a ground-well field that is less than four times the size of that building's well field.  The building includes no on-site renewable power generation whatsoever.  Based on equipment size, the utility estimates a 1,000,000 kWh increase in annual electric use.  That means the new, "energy efficient" hotel will use nearly 2,000,000 kWh of electric energy — more than double that used by the Oberlin Inn it replaces.

What, then, could be the basis of the solar power claim?  The President's Office would have people believe the solar energy for the hotel is coming from the 2.2 MW photovoltaic (PV) array constructed four years ago north of the athletic fields, the so-called OSSO array.  Apparently the College is trying to convince the US Green Building Council (USGBC) that this array provides "on-site renewable energy" to the hotel – worth as many as 8 points towards its coveted LEED certification.

But what is on-site solar electricity?  On-site solar, such as that provided by the two photovoltaic (PV) arrays at the Adam Joseph Lewis Center, furnishes electric power directly to a building, avoiding the transmission losses that occur when power passes through multiple high-voltage transformers and transmission lines.  On-site solar generation, added to an existing building, lowers the building's fossil energy and carbon footprint.  And, by avoiding transmission losses, the benefits of on-site solar are greater than those achievable through off-site renewable sources.  It should be noted that the converse – adding a building to an existing solar array – increases total greenhouse gas emissions!

It is not possible for the OSSO PV array to provide on-site electricity to the Hotel at Oberlin.  First, it is located a mile away from the Hotel — not exactly "on-site."  Second, the College entered into the OSSO project long before the hotel was conceived.  When the OSSO array was constructed in 2012 the College chose to connect it directly to the OMLPS electric grid.  Transmission losses are not avoided.  Third, the City takes all of the array's electric energy and, in turn, pays the College a premium rate (above the City's average wholesale generation cost) of $0.085 per kWh.  This arrangement has zero impact on electric sales to College buildings – each building continues to purchase retail electric energy from OMLPS as if the array did not exist.  The City sends the College a monthly check in exchange for this energy; these payments, to date, total more than $800,000.  OMLPS includes the OSSO PV array in its power portfolio.  Once electrons enter the OMLPS grid they go everywhere; they are not "special electrons" that only go to the Hotel or other College buildings.

And finally, even if the College now chose to construct a dedicated, mile-long cable to connect the OSSO array to the Hotel, it would be of no use because the College signed a 25-year contract to deliver 100% of the array's energy to the City in exchange for $85/MWh.

Off-site renewable energy is a good thing, too.  A building can claim off-site renewable energy by purchasing Renewable Energy Credits, or RECs.  The USGBC provides up to 3 points towards LEED certification if a building uses RECs to offset a large fraction of its electric use.  In principle, the RECs produced by the OSSO array make the Hotel at Oberlin eligible for these points.  In fact, OMLPS already holds RECs (mostly wind and hydro) to cover about 85% of its electricity.  That means that any building purchasing energy from the OMLPS grid can claim RECs for 85% of its electric energy.

The College receives all of the RECs associated with OSSO's energy and does not sell these to the City.  The day OSSO went on-line, these RECs made Oberlin College a greener place – and that is a good thing!  The credit is entirely due to the OSSO array.  In principle these RECs may be "assigned" to any College building.  They could, for instance, be assigned to Finney Chapel, built more than 100 years ago.  Does this assignment now make Finney Chapel "100% solar powered?"  Perhaps – but Finney Chapel remains the same energy hog it has always been.  And no one would be fooled by this association into believing that Finney Chapel is now worthy of architectural design awards.  Assigning these RECs to Finney Chapel does not make Oberlin College any greener than it was in 2012, the day the OSSO array began producing its green energy.

And so calling the Gateway Hotel “solar powered” tells us nothing about the hotel or its design; it is nothing more than a cheap marketing trick.

But, as it turns out, it isn’t cheap at all – it is an expensive marketing trick.

The financial model that justified the OSSO array called for the College to sell these solar RECs into Ohio’s REC market and replace them with cheaper wind RECs — adopting a strategy similar to that used by the City of Oberlin to generate its now famous REC revenue.  Put simply, the College would sell its solar RECs into the Ohio REC market at a high price (perhaps $50/MWh) and replace them with cheaper wind RECs (perhaps $5/MWh).  This strategy provided the College with more than $200,000 additional revenue during OSSO’s first two years of operation.

But after the first two years the College stopped selling its solar RECs – forgoing tens of thousands of dollars in revenue.  During this time Ohio REC prices dropped significantly.  Yet even today the solar RECs generated by OSSO have an estimated annual value of $45,000.

Perhaps this lost revenue represents incompetence on the part of the Oberlin College Finance Office.  In addition, this office has remained silent while the City debates whether to return REC revenues to electric customers – of which the College's portion would be $200,000 per year!  These are strange financial decisions at a time when the College is desperately seeking to close a huge budget deficit and threatening to downsize its workforce.

I believe Oberlin College's decision not to sell RECs is more calculated: the decision was made to bolster the narrative that this array provides on-site solar energy to the Hotel at Oberlin.  But in 2014, when the designers of the Hotel at Oberlin came up with this scheme, it was too late.  The College had already connected the array directly to the City grid, entered a 25-year contract with the City, and sold off two years of its solar RECs.  But why let facts get in the way?  I believe the President's Office decided to stop selling the RECs and pushed the USGBC to accept the idea that the OSSO array provides on-site solar energy to the Hotel at Oberlin — facts be damned!

What is the cost of this decision?  It appears the College failed to honor the third year of its contract to sell solar RECs at $50/MWh.  No doubt the purchasing party in that contract did not object, since the market value of these RECs has fallen to $15/MWh.  The array is expected to produce 3,000 MWh per year.  The lost revenue from REC sales for 2015 is probably $100,000, and at current REC prices, continued failure to sell these RECs represents an additional $30,000 per year in lost net revenue.  (Note that out-of-state wind RECs purchased to replace solar RECs cost $5/MWh.)

LEED certification is known to add to the cost of design and construction.  But in this case the College is looking to pay an annual fee of $20,000 (in lost REC revenue) to buy 5 LEED points towards its Hotel certification!  (The Hotel's estimated electric use is 2/3 the amount produced by the OSSO array.)  Is LEED certification really worth such an ongoing expense — a kind of franchise fee?
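A rough check of that $20,000 figure, using the numbers cited above and assuming the hotel is credited with 2/3 of the array's output:

```python
array_output_mwh = 3000        # expected annual OSSO production, MWh
hotel_share = 2 / 3            # hotel electric use relative to array output
solar_rec_price = 15           # $/MWh, current market value cited above
wind_rec_price = 5             # $/MWh, replacement out-of-state wind RECs

net_value_per_mwh = solar_rec_price - wind_rec_price
forgone = array_output_mwh * hotel_share * net_value_per_mwh
print(f"Forgone net REC revenue attributable to the hotel: ${forgone:,.0f} per year")
```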

The more disturbing question in all this has to do with the fiscal responsibility of these kinds of decisions.  It is pretty clear that the President of Oberlin College pays more attention to his Special Assistant on Sustainability and the Environment than he does to his own V.P. of Finance.  How long will the Oberlin College Board of Trustees allow this insanity to go on?

San Francisco PUC Building not so green

The San Francisco Public Utilities Commission Administration building, constructed in 2012, has been billed as the greenest office building in North America.  Yesterday the San Francisco Examiner published an article suggesting the declaration was a bit premature.  According to its author, Joshua Sabatini, the $202 million, LEED Platinum building has not performed up to expectations.  The building included integrated photovoltaic panels and wind turbines — enough to provide 7% of the building's energy (it is not clear whether that is total energy or just electric energy).  The energy produced by the wind turbines was never metered, and the turbines have already been decommissioned; the company that installed them has filed for bankruptcy.  While the PV panels are reported to have performed satisfactorily, the inverter room was overheating, requiring the installation of an auxiliary cooling system.  We will have to take the SFPUC's word for this result, as nowhere can I locate specific information about the expected PV electric generation.  It is so much easier to control the story when you don't share the facts.

But Sabatini's article does not discuss the energy performance of this building, which is also rather disappointing.  According to 2014 energy benchmarking data published by San Francisco for municipal buildings, the 277,511 sf SFPUC building had a measured site EUI of 54 kBtu/sf, just 10% lower than the mean for SF office buildings (60 kBtu/sf).  This is hardly the 32% energy savings claimed on the sfwater.org website.  Moreover, the source EUI for this building is 153 kBtu/sf, which is 10% higher than the mean for the other 38 municipal office buildings whose 2014 energy data were disclosed.  This "greenest office building in North America" uses 10% more primary energy than other municipal office buildings — most of them constructed many years ago.

In other words, this LEED Platinum building, the greenest office building in North America, uses 10% more primary energy than its counterparts in the San Francisco municipal building stock.  Sounds like a real winner.

Previously in 2009 I found that LEED-certified office buildings demonstrated modest (about 10%) site energy savings but, owing to their greater reliance on electric energy, demonstrated no significant source energy savings.  The result for the SFPUC building is even worse.

2012 CBECS show building energy use up from 2003

Last week the U.S. Energy Information Administration (EIA) released summary energy use data from its 2012 Commercial Building Energy Consumption Survey (CBECS).  The EIA reports that, as compared with 2003 results, the energy use intensity (EUI) for all U.S. commercial buildings has decreased by 12%.  They also report that EUI for office buildings and educational buildings decreased by 16% and 17%, respectively.  These numbers, taken at face value, would appear to be encouraging.

But dig a little deeper and you find there is not much to celebrate.  The first thing to note is that Mother Nature does not care about energy use intensity.  This is a man-made metric for comparing energy use between buildings of different size.  What really matters is total greenhouse gas emissions and total fossil fuel consumption.  To arrest global climate change, or at least to stabilize it, will require a global reduction in annual greenhouse gas emissions.

The 2012 CBECS data show that the total gross square footage (gsf) of the U.S. commercial building stock has expanded by 21% since 2003.  Its total (site) energy consumption has expanded by 7%.  (That is consistent with the 12% drop in EUI: 21% more floor area at 12% lower EUI works out to roughly 7% more total energy.)  That's right — U.S. buildings are using more (not less) energy.  During this same time the U.S. population grew by 7.6%.  If world energy consumption and greenhouse gas emissions continue to grow with world population, we are doomed!  Energy use in undeveloped countries will grow much faster than population as they increase their standard of living.  This growth is especially notable in India and China.  Developed countries like the U.S. — which already use 5X-10X more energy per capita than non-developed countries — must decrease their energy consumption and greenhouse gas emissions.  Yet the U.S. is not even holding steady.

The above figures are based on site energy — not primary or source energy, which is what really matters.  Building source energy — which includes the off-site energy use associated with energy generation and transportation — is a better indicator of the primary energy consumed by buildings.

I have made crude source energy calculations based on the 2012 CBECS summary data and find that for all U.S. commercial buildings source EUI decreased by only 7%, and that source EUI for offices and educational buildings decreased by 12% and 13%, respectively.

But again, what matters is total primary, or equivalently, source energy consumption.  When you combine these figures with the 21% growth in building gsf, you find that the total source energy for all buildings increased by 13% — faster than the rate of population growth!  For offices and educational buildings the increases in source energy were 15% and 8%, respectively.  For offices that is double the rate of U.S. population growth, and for educational buildings it is about the same as population growth.

The period from 2003 to 2012 was the decade of ENERGY STAR and LEED building certification.  These programs both provide cover for building owners to "feel good" about their ever-growing buildings that consume more energy and produce more greenhouse gas emissions — yet are judged to be "green" and "energy-efficient."  Proponents of these programs will claim that, while their accomplishments are disappointing, things would be far worse if these programs and their goals did not exist.  I doubt the truth of this assertion.  There is no evidence that ENERGY STAR and LEED-certified buildings are performing any better, on average, than other commercial buildings.  These programs are pretty much a distraction from the important societal goal of reducing greenhouse gas emissions.

The 2012 CBECS data also put into perspective the EPA's claims that ENERGY STAR benchmarking is saving energy.  In 2012 the EPA published marketing literature claiming that 35,000 buildings that used Portfolio Manager to benchmark for the consecutive years 2008, 2009, 2010, and 2011 demonstrated a 7% reduction in source EUI over this period.  The analysis is sophomoric because they literally average the EUI for these 35,000 buildings rather than calculate their gross source EUI (as CBECS does), which is the sum of all their source energy divided by the sum of their gsf.  It is entirely possible that the gross EUI for these buildings did not decrease at all while their average showed a 7% reduction.  The set of 35,000 buildings in the EPA study is dominated by office buildings — by far the largest group of buildings that use the EPA's benchmarking software.  Hence the claim of a 7% reduction in source energy over a three-year period must be seen in a context in which all U.S. office buildings saw a reduction in source EUI of 12% over a nine-year period.  There is simply little reason to believe that buildings that benchmark perform any better than those that don't.
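The difference between the two metrics is easy to see with made-up numbers: the average of individual building EUIs can fall even while the portfolio's gross EUI rises.

```python
import numpy as np

# Two hypothetical buildings: a small one that improves a lot, a large one that gets worse.
gsf    = np.array([10_000, 500_000])    # floor area, sf
eui_y1 = np.array([100.0, 80.0])        # source EUI in year 1, kBtu/sf
eui_y2 = np.array([60.0, 82.0])         # source EUI in year 2, kBtu/sf

mean_change = eui_y2.mean() / eui_y1.mean() - 1
gross_y1 = (eui_y1 * gsf).sum() / gsf.sum()     # total source energy / total gsf
gross_y2 = (eui_y2 * gsf).sum() / gsf.sum()
gross_change = gross_y2 / gross_y1 - 1

print(f"Change in average EUI: {mean_change:+.1%}")   # falls sharply
print(f"Change in gross EUI:   {gross_change:+.1%}")  # actually rises
```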

Once again real energy performance data cast doubt on energy savings claims for U.S. buildings.


EPA makes a mockery of the Freedom of Information Act

In my attempt to understand the EPA's methodology for calculating ENERGY STAR building benchmarking scores, I have frequently requested specific information from the EPA.  Early on I found the EPA reluctant to share anything with me that was not already publicly released by the agency.  Dissatisfied with this lack of transparency, I decided to formally request information from the EPA through the Freedom of Information Act (FOIA) process.  I filed my first FOIA request in March of 2013.  I have since filed about 30 such requests.

The Freedom of Information Act requires that a Federal agency respond to such requests within 20 working days.  If the Agency fails to comply, you can file a lawsuit with the Federal Courts and be virtually guaranteed a summary judgment ordering the Agency to release the requested documents.  Of course the courts move at a snail's pace, so you cannot expect this process to produce documents anytime soon, nor can you expect the courts to act in a rapid time frame.

The EPA keeps statistics on how it addresses FOIA requests.  It sorts requests into two tracks, a Simple Track and a Complex Track.  EPA policy is to make every attempt to respond to Simple FOIA requests within the statutory 20-day time frame.  Complex FOIA requests take longer because it takes time to locate the documents and process them for public release.  For instance, if you request all of Hillary Clinton's emails, it will take time to locate them and to eliminate any portions that might be classified.

The EPA has also adopted a first-in, first-out policy for processing FOIA requests from a particular requester.  So, if I already have a complex FOIA request in the queue and I file a second complex FOIA request, it is the EPA's policy to complete processing of the first request before turning to the second.  The same policy applies to requests in the Simple Track.  But it is EPA policy to treat these two tracks independently.  This means that if I have a pending FOIA request in the Complex Track queue and subsequently file a Simple FOIA request, the EPA's policy is to work on these two requests in parallel.  That is, it will not hold up a Simple FOIA request in order to complete a Complex request that was filed earlier.

I have a Complex FOIA request with the EPA that has been outstanding for nearly two years.  I have no expectation that the EPA will respond to this request unless I seek assistance from the courts.  They are simply intransigent.  This, combined with the EPA's first-in, first-out policy, means that the EPA will not process any other complex FOIA requests from me unless I get the courts involved.

On August 9, 2015, I filed a FOIA request asking the EPA to provide me with copies of 11 documents that summarize the development and revision of the EPA's Senior Care Facility ENERGY STAR building model.  I know these documents exist because I had earlier received an EPA document entitled "ENERGY STAR Senior Care Energy Performance Scale Development," which serves as a table of contents for the documents associated with the development of this model.  This request requires no searching, as the requested documents are specifically identified, readily available, and cannot possibly raise national security issues.  Yet the EPA placed this request in its "Complex Track" and provided no response to me for more than 20 days.

On September 14, 2015, having received no response, I filed what is called an "Administrative Appeal," asking the Office of General Counsel to intercede and force the agency to produce the requested documents.  In my appeal I pointed out that my FOIA request was, by its very definition, simple, and thus EPA policy required the Agency to act on it within the 20-day statutory period.  By law the EPA has 20 working days to decide an Administrative Appeal.

On Friday, October 30, 2015, the EPA rendered a ruling on my Administrative Appeal.  The ruling is simple — the Office of General Counsel directs the Agency to respond to my initial request within 20 working days.  Think of it: 58 working days (two and a half months) after I filed my initial FOIA request — a request which by law should have been answered within 20 working days — the EPA has now been told by the Office of General Counsel to respond to my request within 20 working days.  What a farce!


ENERGY STAR building models fail validation tests

Last month I presented results of research demonstrating that the regressions used by the EPA in 6 of the 9 ENERGY STAR building models based on CBECS data are not reproducible in the larger building stock.  What this means is that ENERGY STAR scores built on these regressions are little more than ad hoc scores that have no physical significance.  By that I mean the EPA's 1-100 building benchmarking score ranks a building's energy efficiency using the EPA's current rules, rules which are arbitrary and unrelated to any important performance trends found in the U.S. commercial building stock.  Below you will find links to my paper as well as PowerPoint slides and audio of my presentation.

This last year my student, Gabriel Richman, and I have been devising methods, using the R statistics package, to test the validity of the multivariate regressions used by the EPA for its ENERGY STAR building models.  We developed computer programs to internally test the validity of regressions for 13 building models and to externally test the validity of 9 building models.  The results of our external validation tests were presented at the 2015 International Energy Program Evaluation Conference, August 11-13 in Long Beach, CA.  My paper, "Results of validation tests applied to seven ENERGY STAR building models," is available online.  The slides for this presentation may be downloaded, and the presentation (audio and slides) may be viewed online.

The basic premise is this.  Anyone can perform a multivariate linear regression on a data set and demonstrate that certain independent variables serve as statistically significant predictors of a dependent variable which, in the case of the EPA's building models, is the annual source energy use intensity, or EUI.  The point of such regressions, however, is not to predict EUI for buildings within this data set — the point is to use the regression to predict EUI for other buildings outside the data set.  This is, of course, how the EPA uses its regression models — to score thousands of buildings based on a regression performed on a relatively small subset of buildings.

In general there is no a priori reason to believe that such a regression has any predictive value outside the original data on which it is based.  Typically one argues that the data used for the regression are representative of a larger population and that it is therefore plausible that the trends uncovered by the regression are also present in that larger population.  But this is simply an untested hypothesis.  The predictive power must be demonstrated through validation.  External validation involves finding a second representative data set, independent of the one used to perform the regression, and demonstrating the accuracy of the original regression in predicting EUI for buildings in this second data set.  This is often hard to do because one does not have access to a second, equivalent data set.

Because the EIA's Commercial Building Energy Consumption Survey (CBECS) is not simply a one-time survey, other vintages of the survey can supply a second data set for external validation.  This is what allowed us to perform external validation for the 9 building models that are based on CBECS data.  Results of external validation tests for the two older models were presented at the 2014 ACEEE Summer Study on Energy Efficiency in Buildings and were discussed in a previous blog post.  Tests for the 7 additional models are the subject of today's post and my recent IEPEC paper.

If the EUIs predicted by the EPA's regressions are real and reproducible, then we would expect a regression performed on the second data set to yield similar results — that is, similar regression coefficients, similar statistical significance for the independent variables, and similar predicted EUI values when applied to the same buildings (i.e., as compared with the EPA regression).  Let the EPA data set be data set A and let our second, equivalent data set be data set B.  We use the regression on data set A to predict EUI for all the buildings in the combined data set, A+B.  Call these predictions pA.  Now we use the regression on data set B to predict EUI for these same buildings (data sets A+B) and call these predictions pB.  We expect pA = pB for all buildings, or nearly so, anyway.  A graph of pB vs. pA should be a straight line demonstrating strong correlation.
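The test itself is straightforward to script.  Our analysis was done in R; the sketch below shows the same idea in Python, with placeholder file names, column names, and predictors standing in for two equivalent samples A and B.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

predictors = ["weekly_hours", "worker_density", "hdd65", "cdd65"]   # illustrative only

def fit(df):
    X = sm.add_constant(df[predictors])
    return sm.WLS(df["source_eui"], X, weights=df["sample_weight"]).fit()

A = pd.read_csv("cbecs_2003_sample.csv")    # hypothetical file names
B = pd.read_csv("cbecs_1999_sample.csv")
both = pd.concat([A, B], ignore_index=True)
X_both = sm.add_constant(both[predictors])

pA = fit(A).predict(X_both)   # EUI predicted by the regression on data set A
pB = fit(B).predict(X_both)   # EUI predicted by the regression on data set B

# If the regression captures a real, reproducible trend, pA and pB should agree closely.
print("Correlation between pA and pB:", np.corrcoef(pA, pB)[0, 1])
```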

Below is such a graph for the EPA's Worship Facility model.  What we see is that there is essentially no similarity between these two predictions, demonstrating that the predictions have little validity.

[Figure: pB vs. pA for the EPA's Worship Facility model]

This "predicted EUI" is at the heart of the ENERGY STAR score methodology.  Without it, the ENERGY STAR score would simply rank buildings on their source EUI.  But the predicted EUI adjusts the rankings based on operating parameters — so that a building that uses above-average energy may still be judged more efficient than average if it has above-average operating characteristics (long hours, high worker density, etc.).

What my research shows is that this predicted EUI is not a well-defined number but instead depends entirely on the subset of buildings used for the regression.  Trends found in one set of buildings are not reproduced in another, equally valid set of similar buildings.  The process is analogous to using past stock market values to predict future values.  You can use all the statistical tools available and argue that your regression is valid — yet when you test these models you find they are no better at picking stock winners than monkeys are.

Above I have shown the results for one building type, Worship Facilities.  Similar graphs are obtained when this validation test is performed for Warehouses, K-12 Schools, and Supermarkets.  My earlier work demonstrated that Medical Office and Residence Hall/Dormitories also failed validation tests.  Only the Office model demonstrates strong correlation between the two predicted values pA and pB — and this is only when you remove Banks from the data set.

The release of 2012 CBECS data will provide yet another opportunity to externally validate these 9 ENERGY STAR building models.  I fully expect to find that the models simply have no predictive power with the 2012 CBECS data.