Colorado used to be counted among national leaders on climate change and renewable energy, with citizens voting in favor of a 2004 initiative to establish a renewable energy standard for qualifying utilities. These standards were then increased multiple times by the Colorado Legislature to their current level, 30% renewable generation for investor-owned utilities by 2030 and 20% for large electric co-operatives.
Now, as governors from three states – California, New York and Colorado – release plans or sign legislation related to climate change and renewable energy, it is clear that Colorado no longer leads on these issues.
Jerry Brown, California
Yesterday, California Governor Jerry Brown signed SB 350, landmark legislation to reduce air pollution and increase renewable energy. This legislation requires his state to:
Generate 50% of its electricity from renewable sources by 2030
Double energy efficiency of homes, offices and factories
Incentivize utilities to install electric charging stations
Authorize the California grid operator, CAISO, to transform itself into a regional energy market, potentially spurring renewable energy development across the West
Andrew Cuomo, New York
Meanwhile, New York Governor Andrew Cuomo has:
Joined California in signing the Under 2 MOU (Memorandum of Understanding), a compact between states, provinces and cities around the world committing them to limit emissions consistent with holding the increase in global average temperature to 2 degrees Celsius, vowing to reduce emissions between 80% and 95% below 1990 levels by 2050 (and/or below 2 metric tons per capita annually by 2050)
Declared his intention to link the Regional Greenhouse Gas Initiative, a cap and trade market reducing emissions in nine Northeastern states, with other markets in California, Quebec and Ontario
Committed to putting solar on 150,000 homes and businesses by 2020
Committed to installing renewable energy systems at all 64 State University of New York campuses
John Hickenlooper, Colorado
By contrast, Governor John Hickenlooper's Colorado Climate Plan:
Lacks specific emissions reduction goals. While Governor Ritter previously set a goal of 80% emissions reductions by 2050, Hickenlooper's plan avoids even referencing those goals
Proposes no new initiatives to reduce greenhouse gas emissions
Celebrates mining, oil and gas as pillars of Colorado’s economy
Projects that by 2030 Colorado’s emissions will increase 77% from 1990 levels
Where to from here for Colorado?
Despite its deficiencies, Hickenlooper’s plan proudly states that “Colorado is on the right track.” So where will the Governor’s business-as-usual approach take Colorado?
Interestingly, buried in the Plan itself is the answer. Colorado, famous for its beauty in all seasons, is on track for an average temperature rise of more than 6 degrees Fahrenheit by 2050, making seasonal temperatures in Denver most “closely resemble… Albuquerque.”
Colorado has already seen the devastating effects of climate change on our state: the ravages of pine beetle infestations, more intense floods and more destructive fires. If the Governor wishes to preserve the Colorado that Coloradans know and love, he ought to listen to what leaders on climate and renewable energy in New York and California are saying.
Since Governor Hickenlooper’s Energy Office has stressed that this plan is a work in progress, one can only hope that future versions of the plan include goals based in science and concrete actions to achieve those goals. States that are “on the right track” – New York and California – have these goals and initiatives. Colorado ought to as well.
If you’ve been paying much attention to the climate policy discussion over the last few years, you’ve probably heard mention of carbon budgets, or greenhouse gas (GHG) emissions budgets more generally. Put simply, for any given temperature target there’s a corresponding total cumulative amount of greenhouse gasses that can be released, while still having a decent chance of meeting the target. For example, the IPCC estimates that if we want a 2/3 chance of keeping warming to less than 2°C, then we can release no more than 1000Gt of CO2 between 2011 and the end of the 21st century.
The reason the IPCC and many other scientist types use carbon budgets instead of emissions rates to describe our situation is that the atmosphere’s long-term response to GHGs is almost entirely determined by our total cumulative emissions. In fact, as the figure below from the IPCC AR5 Summary for Policymakers shows, our current understanding suggests a close-to-linear relationship between cumulative CO2 released and ultimate warming… barring any wild feedbacks (which become more likely and frightening at high levels of atmospheric CO2) like climate-change-induced fires vaporizing our boreal and tropical forests.
What matters from the climate’s point of view isn’t when we release the GHGs or how quickly we release them, it’s the total amount we release — at least if we’re talking about normal human planning timescales of less than a couple of centuries. This is because the rate at which we’re putting these gasses into the atmosphere is much, much faster than they can be removed by natural processes — CO2 stays in the atmosphere for a long time, more than a century on average. We’re throwing it up much faster than nature can draw it down. This is why the concentration of atmospheric CO2 has been marching ever upward for the last couple of hundred years, finally surpassing 400ppm this year.
So regardless of whether we use the entire 1000Gt budget in 20 years or 200, the ultimate results in terms of warming will be similar — they’ll just take less or more time to manifest themselves.
Unfortunately, most actual climate policy doesn’t reflect this reality. Instead, we tend to make long term aspirational commitments to large emissions reductions, with much less specificity about what happens in the short to medium term. (E.g. Boulder, CO: 80% by 2030, Fort Collins, CO: 80% by 2030, the European Union: 40% by 2030). When we acknowledge that it’s the total cumulative emissions over the next couple of centuries that determines our ultimate climate outcome, what we do in the short to medium term — a period of very, very high emissions — becomes critical. These are big years, and they’re racing by.
Is 1000Gt a Lot, or a Little?
Few normal people have a good sense of the scale of our energy systems. One thousand gigatons. A thousand billion tons. A trillion tons. Those are all the same amount. They all sound big. But our civilization is also big, and comparing one gigantic number to another doesn’t give many people who aren’t scientists a good feel for what the heck is going on.
Many people were first introduced to the idea of carbon budgets through Bill McKibben’s popular article in Rolling Stone: Global Warming’s Terrifying New Math. McKibben looked at carbon budgets in the context of the fossil fuel producers. He pointed out that the world’s fossil fuel companies currently own and control several times more carbon than is required to destabilize the climate. This means that success on climate necessarily also means financial failure for much of the fossil fuel industry, as the value of their businesses is largely vested in the control of carbon intensive resources.
If you’re familiar with McKibben’s Rolling Stone piece, you may have noticed that the current IPCC budget of 1000Gt is substantially larger than the 565Gt one McKibben cites. In part, that’s because these two budgets have different probabilities of success. 565Gt in 2012 gave an 80% chance of keeping warming to less than 2°C, while the 2014 IPCC budget of 1000Gt would be expected to yield less than 2°C warming only 66% of the time. The IPCC doesn’t even report a budget for an 80% chance. The longer we have delayed action on climate, the more flexible we have become with our notion of success.
Unfortunately this particular brand of flexibility, in addition to being a bit dark, doesn’t even buy us very much time. If we continue the 2% annual rate of emissions growth the world has seen over the last couple of decades, the difference between a budget with a 66% chance of success and one with a 50% chance is only ~3 years worth of emissions. Between 50% and 33% it’s only about another 2 years. This is well illustrated by some graphics from Shrink That Footprint (though they use gigatons of carbon, or GtC, rather than CO2 as their unit of choice; since 1 GtC ≈ 3.67 Gt CO2 the budget numbers look different, but the time frames and probabilities are the same):
Like McKibben’s article, this projection is from about 3 years ago. In those 3 years, humanity released about 100Gt of CO2. So, using the same assumptions that went into the 565Gt budget, we would now have only about 465Gt left — enough to take us out to roughly 2030 at the current burn rate.
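The back-of-the-envelope arithmetic above is easy to check. This sketch just uses the article's own round numbers (~465 Gt remaining in the 565 Gt budget, ~100 Gt emitted over the previous 3 years); none of these figures are authoritative data:

```python
# Depletion date for the remaining carbon budget, using the article's
# assumed figures (not official IPCC numbers).

remaining_budget_gt = 465.0        # Gt CO2 left of the 565 Gt budget
annual_emissions_gt = 100.0 / 3.0  # ~33 Gt CO2/yr: 100 Gt over 3 years
start_year = 2015

years_left = remaining_budget_gt / annual_emissions_gt
exhaustion_year = start_year + years_left
print(round(exhaustion_year))  # ~2029, i.e. "roughly 2030" at the current burn rate
```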
There are various other tweaks that can be made to the budgets in addition to the desired probability of success, outlined here by the Carbon Tracker Initiative. These details are important, but they don’t change the big picture: continuing the last few decades’ trend in emissions growth will fully commit us to more than 2°C of warming by the 2030s. 2030 might sound like The Future, but it’s not so far away. It’s about as far in the future as 9/11 is in the past.
It’s encouraging to hear that global CO2 emissions remained the same in 2014 as they were in 2013, despite the fact that the global economy kept growing, but even if that does end up being due to some kind of structural decoupling between emissions, energy, and our economy (rather than, say, China having a bad economic year), keeping emissions constant as we go forward is still far from a path to success. Holding emissions constant only stretches our fixed 1000Gt budget into the 2040s, rather than the 2030s.
If we’d started reducing global emissions at 3.5% per year in 2011… we would have had a 50/50 chance of staying below 2°C by the end of the 21st century. If we wait until 2020 to peak global emissions, then the same 50/50 chance of success requires a 6% annual rate of decline. That’s something we’ve not yet seen in any developed economy, short of a major economic dislocation, like the collapse of the Soviet Union. And unlike that collapse, which was a fairly transient event, we will need these reductions to continue year after year for decades.
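The reason a later peak demands steeper cuts falls out of a geometric series: if emissions peak at E0 and then decline by a constant fraction d every year forever, cumulative future emissions sum to E0/d, so staying within a remaining budget B requires d = E0/B. The numbers below are illustrative assumptions chosen to roughly match the rates quoted above, not IPCC figures:

```python
# Required constant annual decline rate to keep cumulative emissions
# (a geometric series summing to E0/d) within a remaining budget B.
# All inputs are illustrative assumptions.

def required_decline_rate(peak_emissions_gt, remaining_budget_gt):
    """d = E0 / B: annual fractional decline needed after the peak."""
    return peak_emissions_gt / remaining_budget_gt

# Peaking at ~35 Gt/yr with ~600 Gt of budget left:
print(required_decline_rate(35, 600))  # ~0.058, i.e. ~6% per year

# Wait while ~100 Gt of that budget gets burned, and the same peak
# rate requires noticeably steeper cuts:
print(required_decline_rate(35, 500))  # 0.07, i.e. 7% per year
```

Delay is doubly costly: the budget shrinks while the peak emissions rate (the numerator) stays the same or grows.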
The Years of Living Dangerously
We live in a special time for the 2°C target. We are in a transition period that started in about 2010 and, barring drastic change, will end around 2030. In 2010, the 2°C target was clearly physically attainable, but a continuation of our current behavior and recent trends will render it physically unattainable within 15 years. Over the course of these 20 or so years, our probability of success will steadily decline, and the speed of change required to succeed will steadily increase.
I’m not saying “We have until 2030 to fix the problem.” What I’m saying is closer to “We need to be done fixing the problem by 2030.” The choice of the 2°C goal is political, but the physics of attaining it is not.
My next post looks at carbon budgets at a much smaller scale — the city or the individual — since global numbers are too big and overwhelming for most of us to grasp in a personal, visceral way. How much carbon do you get to release over your lifetime if we’re to stay within the 1000Gt budget? How much do you release today? What does it go toward? Flying? Driving? Electricity? Food? How much do these things vary across different cities?
Featured image courtesy of user quakquak via Flickr, used under a Creative Commons Attribution License.
Gwen Hallsmith, the Executive Director of the Public Banking Institute, will present on how public banking could play a role in financing a municipal electric utility and how Boulder can use public banking to enhance community wealth, resiliency, entrepreneurial participation and economic vitality. She will discuss the main advantage of public banking: lower-cost financing, which enables states, counties and cities to better fund small business, infrastructure and projects such as affordable housing, libraries, farm-to-table agriculture, renewable energy, energy efficiency and public transportation. Each of these projects creates good local jobs. In these ways, public banks enable cities, counties, and states to better finance public priorities without relying on Wall Street or paying the high interest rates that pad big bank profits.
Gwen most recently made national headlines with her work in Vermont asking Town Meetings to consider public banking. On March 4th, 18 cities and towns in Vermont voted to endorse a resolution directing the state legislature to create a State Bank for Vermont. Thanks to the media expertise of William Boardman and Matt Stannard, the national media has picked up on the story: there have now been over 20 radio interviews and print stories, and starting this week the story will be on syndicated television as well. Gwen has an interview with GritTV on Tuesday, and we understand that even bigger shows are working on the story – stay tuned.
Gwen is the author of several books on sustainable communities and economic reform, including her most recent book with Bernard Lietaer called Creating Wealth: Growing Local Communities with Local Currencies. She has been an advocate for economic reform for over 25 years, and implemented new currency projects on the local level in her recent position as the Director of Planning and Community Development for the City of Montpelier. Her work spans the globe – she has worked in all the major world regions at this point, and with cities, towns, regions, provinces, and states in the United States and Canada.
Her vision for the Public Banking Institute expands our horizons to include many other aspects of a public monetary system, everything from strengthening the possibilities for local investment that the new SEC regulations allow to fostering and supporting complementary currencies for local and regional means of exchange. Her deep commitment to local action matches our vision for the Institute as a source of technical assistance, training, and research for all the state, regional, and local initiatives underway to set up public banks and other currency and investment initiatives.
One of the main reasons utilities fight distributed generation like rooftop solar is that it erodes demand for their centrally generated electricity. Reduced demand is annoying for any business, but it’s especially bad for traditional monopoly utilities. It’s especially bad because much — even most — of the cost of producing a kWh of electricity doesn’t go away if you don’t produce that kWh of electricity. These so-called “fixed” or “non-production” costs come from multi-decade financial commitments to big pieces of infrastructure — the power plants, transmission lines, and distribution systems.
So when you put solar panels on your roof and reduce the amount of electricity you need to buy from the utility, a little bit of fuel doesn’t get burned, and a little bit of money is saved on the utility side (though as we’ve pointed out before, they don’t actually benefit from that cost savings). But most of the money the utility spent to be able to provide you with electricity if you needed it is already spent. This is problematic because most electricity rates are designed to recover utility costs in proportion to the amount of electricity you buy (this type of rate is known as a “volumetric rate”). So utilities have an incentive (known as the throughput incentive) to ensure that their electricity sales increase, or at the very least don’t decline.
If lots of people start buying much less electricity, this reduces utility spending on things like fuel, but it doesn’t have any effect (in the short term) on the fixed or non-production costs. To stay solvent, the utilities then go back to their regulators and say “Hey, we’re not getting enough revenue to cover our costs. Give us a rate hike!” and if the regulators agree, allowing the utilities to recover the same fixed costs from fewer overall kWh of electricity sold, this just makes it even more financially sensible for people to put solar panels on their roof, to avoid buying the more expensive electricity. (And in our fantasy world, one could also imagine savvy regulators taking measures to decrease fixed costs, by forcing early retirement of risky, uneconomic fossil generation…)
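The feedback loop in that paragraph is easy to see in a toy simulation. Every number here is invented for illustration (a hypothetical utility with hypothetical costs): fixed costs don't shrink when sales do, so each round of defection to rooftop solar forces a higher volumetric rate, which drives more defection:

```python
# Toy death-spiral simulation: fixed costs spread over shrinking sales.
# All figures are hypothetical, chosen only to make the dynamic visible.

fixed_costs = 100_000_000    # $/yr of fixed (non-production) costs to recover
variable_cost = 0.03         # $/kWh of avoidable costs (fuel, etc.)
sales_kwh = 2_000_000_000    # annual kWh sold at the start

rates = []
for year in range(5):
    rate = variable_cost + fixed_costs / sales_kwh  # volumetric rate, $/kWh
    rates.append(rate)
    # Toy demand response: the further the rate rises above an assumed
    # 7 c/kWh rooftop-solar alternative, the more load defects next year.
    sales_kwh *= 1 - 2.0 * (rate - 0.07)

print([round(r, 4) for r in rates])  # the rate ratchets upward every year
```

The exact numbers don't matter; the point is that recovering a constant fixed cost from a shrinking sales base is self-reinforcing.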
This is the essence of the Utility Death Spiral that’s gotten so much attention over the last year or two (including a speakeasy we hosted), and which Dave Roberts did a great job of exploring in his Utilities for Dummies series over at Grist. From the Utility’s point of view the Death Spiral can be short-circuited with revenue decoupling… up to a point. With decoupling, they don’t have to go to regulators and ask for a rate hike — they can recover the fixed costs in a formulaic way, and so decoupled utilities are able to invest in energy efficiency without worrying about lost revenues. They’re also likely to be less opposed to modest amounts of distributed generation.
In fact, it’s hard to imagine a climate-aware utility of the future that isn’t decoupled. We need to get away from utilities treating electricity (and energy more generally) as a commodity, with profits tied to the quantity of product they sell. Instead, we need to move toward treating energy as a service — Amory Lovins’ famous hot showers and cold beer — with an incentive to provide high quality service using the least possible amount of underlying energy.
Decoupling is a Good Thing™
However, if you care about climate, then you always have to ask not just Is this a good thing? but Is this good enough? It’s an old cliché that “better is the enemy of good enough”: spending time, money and effort on improvement beyond what’s good enough can be wasteful. But in the context of climate, we have the opposite problem. Moving things in the right direction can still mean abject failure. Plenty of things that are better than the status quo — like decoupling utility revenues, or burning natural gas instead of coal — come nowhere close to being good enough to keep us from seeing more than 2°C of warming.
To have a chance of stabilizing the climate, the utility business model can’t just be tinkered with. It needs to be radically transformed. The good news is that radical transformation is probably on the table whether the utilities want to talk about it or not. Our task is to make it happen as quickly and smoothly as possible.
Utility Death Spiral: Not Just for the Paranoid
Until very recently anybody afraid of the death spiral dynamic might have seemed a little paranoid. DG was still pretty expensive, and often dependent on utility rebate programs, tax credits, and other incentives that were often controlled by regulators and utilities. As the price of distributed solar has fallen, rebates have dwindled to nothing, and new financing mechanisms and business models have emerged. Utilities and regulators have lost some of their ability to moderate deployment, and they’re poised to lose much more.
Mosaic has created a peer-to-peer lending platform that lets individuals invest in diversified portfolios of smaller distributed solar projects, earning around a 5% return on their investments. They’ve done about $10M worth of financing this way. Now they’re getting into solar loans with backing from a large international re-insurer, adding another $100M in capital.
Sungage just raised $100M in funding from a large northeastern US credit union to use as a revolving solar loan fund.
SolarCity has started issuing solar bonds with a similar yield directly to the public on a much larger scale. They’ve raised more than $100M so far, without going through the traditional finance industry.
Big-time sprawling suburban home builder Lennar is now installing rooftop PV systems by default in some markets, including around Denver. They’re offering home buyers a power purchase agreement (PPA) with a 20% discount off retail electricity rates for 20 years.
From the consumer’s point of view what this means is that in an increasing number of markets, rooftop solar can now be had at a discount to utility power, with no up front costs. This is new and different and scary for utilities, because it means rooftop solar can go big. Fast. Additionally, Elon Musk (who heads both electric car maker Tesla Motors and SolarCity…) is investing $5 billion (with a B) in a massive lithium ion battery factory in Nevada, hoping to drive costs down through economies of scale.
Suddenly, a good chunk of the traditional utility customer base starts to look a little sketchy.
In Colorado (and elsewhere) these dynamics have brought us to a regulatory stalemate. For once the status quo — net metering — favors distributed renewable electricity. It’s the policy that Big Solar has bet the farm on. But if we try and use it to scale up cheap rooftop PV dramatically, it may destabilize the utilities.
Straight net metering also won’t result in a particularly optimal deployment of distributed energy resources, because all it accounts for is energy production, and there are many more subtle qualities that are important to a well functioning electricity grid. If we can integrate those other qualities — temporal, geographic, environmental, price stabilization, etc. — into our electricity pricing we’ll get a much better overall outcome. As the Rocky Mountain Institute has put it: the debate over net metering misses the point.
Be that as it may, right now there are two 800lb gorillas (or maybe, an 800lb gorilla and a 300lb gorilla) locked in mortal combat — the utilities on one side and Big Solar on the other. One side is trying to get rid of net metering altogether, and the other is willing to fight to the death to preserve it. When people bring up other ways of valuing distributed renewable energy like Minnesota’s proposed Value of Solar or Feed in Tariffs they tend to either be ignored or attacked, sometimes by both sides of the fight! For example, The Alliance for Solar Choice wasted no time in setting up a campaign to stop what they glibly re-termed Feed in Taxes and Value of Solar Taxes as soon as Minnesota made it clear they were considering Value of Solar seriously.
Headed for Strange Country
As with so many aspects of climate and energy policy, change here is inevitable. Regardless of which side prevails in the fight over net metering, as the cost of distributed solar and energy storage continue to decline, we are headed for strange territory.
If the utilities prevail and repeal net metering, they’ll probably slow the spread of distributed generation, since customers would only be able to benefit economically from satisfying their electricity demand on-site in real time, rather than banking electricity production annually. But in the longer term, given ongoing PV system cost declines and the potential for cost-effective electricity storage, the utilities will still face a decline in electricity demand regardless of whether a policy like NEM remains in place. At one extreme we could end up in a situation (well described by RMI), where defection from the grid is economically sensible for a significant number of people.
On the other hand if Big Solar prevails then we get to the same place, maybe a little quicker, since they’re already operating with a net metering based business model at significant scale. If the Feds don’t renew the Investment Tax Credit in 2016 that will push the economics out a little, but there’s little reason to think the overall price trend is going to reverse. Ever.
Does that sound ridiculous? Then note that PV in 2014 is already 59% cheaper than NREL predicted it would be back in 2010, and Deutsche Bank is forecasting that solar will reach grid parity nationwide by the end of 2016. On the wholesale side, the New York Times reports that without subsidies, wind on the high plains has come in as low as 3.7¢/kWh (the same as just the production costs of Xcel’s Colorado fossil fleet in 2013).
Some folks think widespread grid defection sounds like utopian energy independence. In practice it would be far less equitable, more expensive, and operationally much less robust than a well designed network that integrates a lot of distributed energy. It’s also physically impossible in cities, which consume most of our electricity, because no matter how cheap solar and storage become, cities use more energy within their boundaries than is available from renewable sources in those same boundaries. This is despite the fact that cities have much lower per capita energy use than rural and suburban places of comparable wealth. Cities are great for the climate, but they will always need to import energy, and that means we will still need transmission and distribution systems.
Um, okay. But, decoupling?
In the near term, revenue decoupling would insulate Xcel against the sales they’re going to lose to rooftop solar and other distributed energy. Rather than seeing revenues decline as more electricity sales are displaced, they’d be empowered to adjust rates in a formulaic way to compensate for the losses, and ensure that the fixed costs of the grid continue to be paid for (along with their profits). In theory, this ought to remove or at least reduce their opposition to net metering.
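Mechanically, decoupling is just a true-up: regulators fix an allowed revenue requirement, rates are set from forecast sales, and any under- or over-collection is rolled into the next period's rate rather than litigated in a rate case. A minimal sketch with hypothetical numbers (a 5% sales shortfall from rooftop solar):

```python
# Minimal revenue-decoupling true-up, with invented figures.

allowed_revenue = 50_000_000    # $ the utility may collect, set by regulators
forecast_sales = 1_000_000_000  # kWh expected this year
actual_sales = 950_000_000      # kWh actually sold (rooftop solar ate 5%)

rate = allowed_revenue / forecast_sales  # $/kWh charged this year
collected = rate * actual_sales          # revenue actually received
shortfall = allowed_revenue - collected  # recovered via next period's rate

next_rate = (allowed_revenue + shortfall) / forecast_sales
print(rate, shortfall, next_rate)  # 0.05 $/kWh, $2.5M short, 0.0525 $/kWh next
```

Note that the utility's revenue is made whole either way, which is exactly why decoupled utilities can afford to be calmer about efficiency and distributed generation.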
In the long term, if grid defection becomes attractive, additional fixed-cost recovery mechanisms like revenue decoupling aren’t going to be much help to the utility.
Our task is to open up the discussion about creating an intelligent grid with electricity prices that reflect the more subtle attributes of distributed generation. Revenue decoupling is one potential avenue into that discussion — at least the early part of it. How so?
In the short term, the utilities are fighting for the status quo, minus net metering, and they seem to be losing. If the only two positions available are the status quo with vs. without net metering, the choice for renewable energy and climate advocates is clear — we have to side with Big Solar. But if utilities were actually up for creating a different — and much more scalable — renewable energy policy, then the decision of who to work with becomes more challenging.
With revenue decoupling in place, utilities like Xcel could have more room to consider policies that support distributed generation, without seeing them as an axiomatic threat to their revenues. But to do so, they’d have to be willing to talk about unwinding their existing investments in fossil generation — otherwise, no renewable or distributed generation policy can scale up far enough to be “good enough” for the climate. That vital discussion about unwinding fossil plants is not yet happening out in the open. At least, not in the US. We’ll take a much closer look at it in a post very soon!