Climate Change Report: Clean Power Plan is Unreliable, Erratic, and Expensive

Climate Change Right Side News

By Ken Haapala, President, Science and Environmental Policy Project (SEPP)

Full PDF Report Here

Administration’s Power Plan

Independent analysts continue to provide details of the Obama Administration’s politically named “Clean Power Plan” (CPP). These studies make clear that the only forms of new electrical power generation the administration considers “clean” are solar and wind. Electric power generation from fossil fuels is condemned by the administration. Hydroelectric generation is out of favor, as explained by ex-EPA official Alan Carlin. There are no plans for federally supported new dam construction in the US. In fact, the thrust has been to tear down existing dams in the name of the environment.

Nuclear energy, which produces no carbon dioxide (CO2), is not an option. The administration mothballed the nuclear waste repository at Yucca Mountain and has not offered an alternative. Indeed, in 2009 the EPA published in the Federal Register a rule limiting radiation doses from Yucca Mountain for up to 1,000,000 years after it closes, demonstrating the absurd durations over which the administration considers its edicts enforceable. Biomass burning on a large scale would require clearing the forests, as was done in the eastern US in the 18th and 19th centuries, which would be politically unacceptable.

This leaves only solar and wind as the major sources of electrical power generation. Both are unreliable, erratic, and expensive. The Administration’s concept would be more appropriately termed the unreliable power plan.

Even with its plans to prevent new, reliable electrical-power generation, a report by the Institute for 21st Century Energy of the U.S. Chamber of Commerce finds the plan falls far short of the goals set by Mr. Obama.

“Even with these fairly generous estimates, these measures, which include some programs that haven’t even been announced yet, would fall about 800 MMTCO2 [Million Metric Tons of CO2], or 45%, short of the president’s goal. How does the administration intend to plug the remaining gap? It hasn’t said. When asked by the Financial Times about the holes in the administration’s INDC [Intended Nationally Determined Contributions pledged for the UN Conference of Parties (COP 21) in Paris in December], White House official Rick Duke chose to deny the existence of a problem and instead change the subject: ‘Our numbers are quite clear. It’s other countries where we see more opportunities to clarify what the plans are.'” Boldface added.

We need other countries to define what our plans are? What the administration will do to fill the 45% shortfall is anyone’s guess. The report indicates that major industries should be on the alert. “Still, seeing as the entire industrial sector emitted a little over 800 MMTCO2 in 2013, even very steep cuts by industry won’t deliver nearly what’s needed”, according to the US Chamber.

Terry Jarrett, a former commissioner of the Missouri Public Service Commission, observed: “And if you’re skeptical of the threat posed by man-made CO2 in an ever-changing climate, then you’ll likely balk at the stunning price tag for this new set of rules, which the U.S. Chamber of Commerce estimates at an annual cost of $51 billion in lost GDP and 224,000 jobs lost.”

One can quibble about the numbers, but the direction is clear: the Administration is willing to damage an already weak economy (real growth rate of about 2% during the Administration) in order to fight global warming/climate change, an enemy so ill-defined that the Administration has failed to grasp the natural causes of climate change. See links under The Administration’s Plan – Independent Analysis, and The Administration’s Plan – Push-Back.

Needed Research: On his web site, Roy Spencer, co-founder of the method of measuring atmospheric temperatures by satellites, the only comprehensive, virtually global measurements in existence, reported that: “As part of a DOE grant we are testing climate models against satellite observations, particularly regarding the missing ‘hotspot’ in the tropics, that is, the expected region of enhanced warming in the tropical mid- and upper troposphere as the surface warms. Since 1979 (the satellite period of record), it appears that warming in those upper layers has been almost non-existent, despite some surface warming and increases in low-level humidity.”

It is unclear whether “we” refers to the entire group, based at the University of Alabama in Huntsville, that reports global temperatures.

The research is much needed. In its Second Assessment Report (AR2 – 1996), the UN Intergovernmental Panel on Climate Change (IPCC) erroneously asserted the “hot-spot” was the distinct human fingerprint, which it is not. In 2007, Douglass, Christy, Pearson and Singer found the “hot spot” exists in the models, but not in observations. No one has been able to produce data establishing the “hot-spot.” The issue is more fully discussed at:…

Yet, it is a critical part of the EPA’s 2009 finding that human greenhouse gas emissions, particularly CO2, endanger human health and welfare. Without EPA’s finding, the Administration has no legal or scientific basis for severely restricting CO2 emissions as prescribed in its CPP. See link under Challenging the Orthodoxy.

Balancing the Load: One of the topics avoided by the promoters of wind and solar, including government officials, is the need for balancing the load on the electrical grid, that is, roughly equating consumption with generation. Too much electricity generated at one time will blow transformers, capacitors, and other devices designed to give the system stability. The system will fail, and it may require some time before it can be repaired. Too little electricity generated at one time results in brown-outs, black-outs, and other forms of failure. The load must be balanced constantly, and utility companies do so by engaging electricity providers daily, on an as-needed basis. The electricity so provided is often far more expensive than electricity provided consistently. Conversely, excess electricity must be dumped at low prices.

The only major form of electricity storage in general use is pumped-hydro storage. This usually involves pumping water uphill from a reservoir at one elevation to another reservoir at a higher elevation (several hundred feet higher). From the second reservoir, the water can be drawn down through hydroelectric turbines to create power when needed. In general, the system loses about 20 to 30% of available power and requires large reservoirs. The largest such facility is in Bath County, Virginia. Unfortunately, EPA clean water regulations are making new construction of such facilities very difficult, even where geologically feasible.
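The round-trip arithmetic behind the 20 to 30% loss figure can be sketched briefly (the quantities below are illustrative assumptions, not figures from any facility):

```python
# Rough round-trip energy arithmetic for pumped-hydro storage.
# All numbers are illustrative assumptions, not data from the report.

energy_in_mwh = 1000.0          # electricity spent pumping water uphill
round_trip_efficiency = 0.75    # mid-range of the ~70-80% implied by a 20-30% loss

energy_out_mwh = energy_in_mwh * round_trip_efficiency  # recovered via turbines
loss_mwh = energy_in_mwh - energy_out_mwh               # energy lost in the cycle

print(f"Recovered: {energy_out_mwh:.0f} MWh, lost: {loss_mwh:.0f} MWh "
      f"({1 - round_trip_efficiency:.0%} of input)")
```

In other words, storing 1,000 MWh of surplus wind or solar output returns only about 700 to 800 MWh when it is needed, which is part of why balancing erratic generation is costly.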

On her web site, Jo Nova has graphs showing the erratic nature of Australian wind energy production in July and the first half of August. Similar patterns are found elsewhere, such as West Denmark:…
and the Pacific Northwest (Bonneville Power Administration)…

Zero values are not unusual. From August 17 to August 23, 2015, wind power generation at Bonneville varied from zero to over 4,000 MW, most of the time near the bottom.

No amount of government edicts or regulations will stabilize the wind. In the US, the Administration’s and EPA’s power plan suppresses stable, reliable forms of electricity generation in favor of erratic and unreliable solar and wind; yet other regulations by the EPA and the Administration suppress the ability to stabilize the erratic electrical power so generated. See links under Alternative, Green (“Clean”) Solar and Wind.

Capacity Factors: Another topic that promoters of solar and wind seldom discuss is capacity factors, a measure of reliability. Nameplate capacity is often used by promoters, who will make statements of maximum capacity, such as that the facility will provide enough electricity to power 500,000 homes. But nameplate capacity is not particularly meaningful if the facility will power 500,000 homes for only 5 minutes a day. Preston Cooper of the Manhattan Institute discusses capacity factors of various energy sources in the U.S. By far, in 2013, the greatest average capacity factor was 90.9% for existing nuclear, meaning that the nuclear plants remain on line, generating electricity over 90% of the time. Of course, nuclear is being suppressed by the Administration.
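The capacity-factor arithmetic described above can be sketched as follows. The plant sizes and annual outputs are hypothetical, chosen only to illustrate the calculation (capacity factor = actual energy generated divided by nameplate capacity times hours in the period):

```python
# Capacity factor = actual energy / (nameplate capacity x hours in period).
# The plant figures below are hypothetical, for illustration only.

def capacity_factor(actual_mwh, nameplate_mw, hours):
    """Fraction of the period over which nameplate output was effectively delivered."""
    return actual_mwh / (nameplate_mw * hours)

HOURS_PER_YEAR = 8760

# A hypothetical 1,000 MW nuclear plant generating 7,963,000 MWh in a year:
cf_nuclear = capacity_factor(7_963_000, 1000, HOURS_PER_YEAR)

# A hypothetical 1,000 MW wind farm generating 2,800,000 MWh in a year:
cf_wind = capacity_factor(2_800_000, 1000, HOURS_PER_YEAR)

print(f"nuclear: {cf_nuclear:.1%}, wind: {cf_wind:.1%}")
```

The nuclear example comes out near the 90.9% figure cited above; the wind figure shows how a facility advertised by its nameplate capacity may deliver only about a third of that energy over a year.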

The greatest capacity factor for renewables is geothermal at 67.2%. Certainly, geothermal works well in Iceland, but few urban areas are built where geological plates are separating. There are few locations for geothermal in the US.

Biomass burning at 67.1% has a higher capacity factor than coal (steam turbine) at 58.9%, or natural gas (combined cycle) at 50.3%. Unfortunately, these statistics can be misleading. Biomass is used largely at paper-making and wood-pulp locations, where the waste is burned on site for electricity. Outside the paper and wood industries, biomass means little. Since coal is used more for base-load, it shows a higher capacity factor than natural gas, which is often used for less efficient shoulder- and peak-load generation.

When used alone, the capacity factors of erratic sources of electricity such as solar and wind are misleading because the power is not always available when needed. See links under Alternative, Green (“Clean”) Solar and Wind.

The Hiatus Again? Science Magazine published an article by Kevin Trenberth of the National Center for Atmospheric Research (NCAR), a well-known member of the climate establishment that helps generate the IPCC reports. Reading past the puffery, the article uses global mean surface temperatures to assert a staircase of rising temperatures since 1920. First a rise from 1920 to 1940, then a stable period (hiatus) to 1975, then a rise to 1998, then another inflection point reflecting a lower rate of rise from 1998 to the present (all dates are approximate). Trenberth asserts that, other than “human-induced climate change,” the greatest driver of the temperature variation is the El Niño-Southern Oscillation in the Pacific Ocean. “The year 1998 was the warmest on record in the 20th century because of the 1997-1998 El Niño, the biggest such event on record.”

There are several issues with this analysis. One, a more traditional analysis would have the initial warming from about 1910 to 1940, followed by a modest cooling to 1975, followed by an increase to 2003 (the 1997-98 El Niño year is ignored), followed by the current hiatus. The rate of the first warming is about the same as the rate of the second warming. This goes to the central point: what is the cause of the first warming? CO2 emissions were very low. A secondary point: is the cause of the second warming period different from the cause of the first? The IPCC claims the second warming was caused by human greenhouse gas emissions, but offers no compelling evidence.

Another significant issue is the great inconsistency (since 1979) between global mean surface temperatures, with their frequent adjustments, and the far more comprehensive satellite data, independently supported by measurements from weather balloons. Would Science Magazine publish a similar study by Roy Spencer and John Christy using their data? All this undermines EPA’s claimed evidence that greenhouse gas emissions, particularly CO2, endanger public health and welfare. For the paper and other criticisms, see links under Defending the Orthodoxy.

Oil Glut? Watching those who predicted that oil prices would never fall back away from their predictions is more fun than watching those who predicted unprecedented and dangerous global warming back away from theirs. With the first group, their predictions did not become part of US national policy. With the second group, their predictions are becoming an economically damaging part of US national policy.

Now, with the first group, instead of the world running out of oil, some analysts are forecasting an oil glut with the price dropping to $30 per barrel. The governments of petro-states, whose existence depends on high oil prices, should be worried.

With the second group, western governments are insisting their policies are correct and the danger of human-caused global warming/climate change is established, regardless of the lack of evidence. The citizens of these countries, whose well-being depends on a rational, properly functioning government, should be worried. See Articles # 1 & # 2 and links under Energy Issues – Non-US.

Merchants Again? Jo Nova and Lubos Motl reported that the film “Merchants of Doubt” failed at the box office, but it is now being considered for use in schools. In his youth, Motl lived under Communism and identifies the film as propaganda. In the US, Edward Bernays, “the father of modern advertising,” “pioneered the scientific technique of shaping and manipulating public opinion, which he called ‘engineering of consent.'” During World War I, Bernays was an integral part of the US Committee on Public Information, which sold the war to the US public as necessary “to make the world safe for democracy.”

Bernays titled the opening chapter of his 1928 book, Propaganda, “Organizing Chaos,” with the opening paragraph stating:

“The conscious and intelligent manipulation of the organized habits and opinions of the masses is an important element in democratic society. Those who manipulate this unseen mechanism of society constitute an invisible government which is the true ruling power of our country.”

Based on the book Merchants of Doubt by Oreskes and Conway and the recent film, one can ask: who are the masses that need to be manipulated? Could it be those who believe the book and the film? See links under Communicating Better to the Public – Use Propaganda and Communicating Better to the Public – Use Propaganda on Children.