The capacity market, which provides payments to ‘keep the lights on’, is one of the energy policies still surviving after a summer of scrappages and watering down. The bidders for the second auction were announced at the end of last week, and, as before, the list is dominated by carbon-heavy power stations: 48 per cent gas plants and 19 per cent coal plants (including Aberthaw, which raises the worrying prospect of public money supporting a power station that’s currently breaking pollution laws).
This is perverse: new evidence from the government’s own trials shows that negawatts, which can cut power demand at peak times, are a far cheaper (and lower carbon) means of keeping the lights on. But the government has bungled its presentation of the numbers, making watts look cheaper than negawatts. Here’s how to untangle the evidence.
The results of the first capacity market auction in 2014, and the first Electricity Demand Reduction (EDR) pilot auction – a separate, demand-side-only capacity auction – in 2015, seemed to suggest that capacity market watts were far cheaper than EDR pilot negawatts. The capacity auction cleared at £19.40 per kW per year, while the EDR auction’s average bid was £229 per kW per year.
How could UK negawatts be costing so much?
Green Alliance has promoted negawatts for a while now, and we were initially disheartened by what seemed to be a clear case of watts being cheaper than negawatts. How could we have got things so wrong? We were confused: the UK’s results fly in the face of international evidence. A survey of 20 efficiency programmes in the US found the average cost of negawatts was around £30 per MWh (converted from $46 per MWh). Compare that to the latest DECC estimate of the levelised cost of a new gas plant: around £80 per MWh (though falling gas prices mean a more up to date figure would be lower, perhaps in the region of £70 per MWh). Even accounting for the fact that the capacity market only pays a part of the levelised cost of generation, how could the UK be so different from the US?
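The gap between the US evidence and the UK cost of new gas is easier to see with the per-MWh figures side by side. A quick sketch using the numbers above; the exchange rate is our assumption, back-calculated from the $46 ≈ £30 conversion:

```python
# Rough per-MWh comparison of US negawatts vs new UK gas generation,
# using the figures quoted in the text.
USD_PER_GBP = 1.53          # assumed 2015-era exchange rate (implied by $46 -> ~£30)

us_negawatts_usd = 46.0     # average across 20 US efficiency programmes, $/MWh
us_negawatts_gbp = us_negawatts_usd / USD_PER_GBP

gas_lcoe_gbp = 80.0         # DECC levelised cost estimate for a new gas plant, £/MWh

print(f"US negawatts: ~£{us_negawatts_gbp:.0f} per MWh")
print(f"New gas (levelised): ~£{gas_lcoe_gbp:.0f} per MWh")
print(f"Negawatts cost ~{us_negawatts_gbp / gas_lcoe_gbp:.0%} of new gas")
```

Even against the lower £70 per MWh gas figure, efficiency still comes in at well under half the cost.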
It turns out it isn’t; it’s all a matter of accounting. EDR measures reduce demand all year round, but the EDR pilot only pays for the electricity saved at peak times. Power stations, by contrast, have two revenue streams: they earn capacity payments and they sell electricity. The EDR pilot’s measures can only make money from the auction. So a fair comparison means looking at the total cost to consumers: the electricity cost plus the capacity payment.
When we looked at the costs over the EDR measures’ lifetimes, we saw a different picture. Let’s assume that the efficiency measures in the EDR pilot (almost all of which were improvements to lighting) cut demand for 12 hours a day (when it’s dark) over a roughly 12 year measure lifetime. Spreading the £229 per kW payment across the roughly 53 MWh that each kW of reduction saves over that period, the cost of EDR works out at just £4.35 per MWh, well below even the marginal cost of generation for fossil power stations.
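The amortisation behind that figure can be sketched in a few lines. This treats the £229 per kW average bid as a one-off payment spread over the measure’s lifetime electricity savings; the 12 hours a day and roughly 12 year lighting lifetime are assumptions, chosen because they reproduce the per-MWh figure quoted above:

```python
def cost_per_mwh(payment_gbp_per_kw: float, hours_per_day: float, years: float) -> float:
    """Spread a one-off capacity payment (£ per kW of demand reduction)
    over the lifetime electricity savings of the measure, in £/MWh."""
    mwh_saved_per_kw = hours_per_day * 365 * years / 1000  # kWh -> MWh
    return payment_gbp_per_kw / mwh_saved_per_kw

# EDR pilot average bid, amortised over an assumed 12 year lighting lifetime
edr = cost_per_mwh(229, hours_per_day=12, years=12)
print(f"EDR negawatts: ~£{edr:.2f} per MWh")  # ~£4.36 per MWh
```

The result is sensitive to the assumed lifetime: halve the years and the per-MWh cost doubles, but it remains far below the cost of generating the same electricity.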
This implies that a) negawatts are far cheaper than watts, and b) the current auction system is securing only a fraction of the economically viable negawatts. For the upcoming second phase of the EDR pilot, DECC has made some positive changes, addressing several of the scheme’s limitations, but it has also cut the already tiny budget – £10 million per year, against £180 million for the capacity market – by 40 per cent.
This seems like a missed opportunity, because the UK has enormous technical potential for reducing its electricity demand. Previous Green Alliance analysis conservatively estimated a saving of 6.4 GW by 2030 from demand reduction alone, equivalent to the capacity of eight 800 MW combined cycle gas turbine (CCGT) power stations. The first EDR auction procured just 5.6 MW of savings, less than 0.1 per cent of that 6.4 GW opportunity.
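The scale mismatch is stark when computed directly. A quick check, using only the figures quoted above:

```python
potential_gw = 6.4   # estimated demand reduction potential by 2030
ccgt_mw = 800        # capacity of one combined cycle gas turbine plant
procured_mw = 5.6    # savings bought in the first EDR auction

equivalent_plants = potential_gw * 1000 / ccgt_mw
share_procured = procured_mw / (potential_gw * 1000)

print(f"Equivalent CCGT plants: {equivalent_plants:.0f}")
print(f"Share of potential procured so far: {share_procured:.2%}")
```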
So there you have it. The mystery of the UK’s peculiarly expensive negawatts is solved, but the bigger question of why we’re persisting with an expensive, high carbon, outdated capacity market structure remains. Something else to add to your list of energy policy phenomena to be puzzled about this year.