Sunday, May 17, 2015

ENERGY FOR FUTURE PRESIDENTS (Chapter 3)

Muller, Richard A. Energy for Future Presidents: The Science Behind the Headlines. New York: W. W. Norton & Company, 2012.

The print version of this book is 368 pages in length.

In this book, the author presents lessons to future presidents on various sectors of energy use and alternative energy prospects, with the goal of clarifying, correcting, and expanding on the information behind the news headlines.  From the author’s perspective, the president has a responsibility to be knowledgeable about these areas and to act as a “teacher” to the public, drawing on information that goes beyond the headlines so that people can make informed decisions about energy. He tackles a wide-ranging list of energy and energy-related topics including: energy-related disasters, global warming, shale oil, alternatives to transportation fuel, energy efficiency and conservation, solar energy, wind energy, energy storage, nuclear power, biofuels, synfuels, hydrogen fuel, hybrid autos, and carbon dioxide mitigation measures.

I chose this book despite its broad coverage because energy is a shared purview of both physics and chemistry.  The theme of the book is looking at headlines and providing a scientific and mathematical perspective to inform people’s interpretation and perception of these issues.  These are the same headlines that the president, my students, and I invariably read every day and that are, for many, the primary source of information.

In Part I, the author provides his perspective on 3 major energy catastrophes, presenting some facts, his interpretation of the risks and ramifications, and his opinion on how these should inform government decisions and responses.

The first chapter deals with the Fukushima reactor meltdown following damage from the earthquake and tsunami of March 2011.  He predicts that the number of cancer deaths from the radiation will be small, less than 1% of the death toll caused by the earthquake and tsunami themselves. On this basis, he proposes that nuclear reactors should be built strong enough that fewer deaths result from radiation released when a plant is damaged than from whatever caused the damage.  He also proposes using the average annual radiation dose that people in Denver receive as a standard for determining what the disaster response to a radiation release should be.  Against these two standards, he argues that the Fukushima plant was actually adequately designed, based on the low projected human death toll, despite the fact that it was not built to withstand a 9.0 earthquake and a 50-foot tsunami.

In Chapter 2, the author questions the President’s characterization of the Gulf Oil Spill of 2010, caused by the Deepwater Horizon oil rig accident, as the “greatest environmental disaster” in history. He argues that the ensuing animal deaths, at around 6,000, were small relative to the hundreds of millions of bird deaths each year from collisions with glass windows and high-voltage lines.  The beaches remained relatively clean compared to the damage done by the Exxon Valdez spill to the Alaskan shore.  He “senses” that the overreaction did more damage to the region, in terms of its effect on tourism and the local economy, than the spill itself.

Chapter 3 covers quite a bit of material, starting with the author’s account of his group’s efforts to confirm the temperature increase data.  His group, through the Berkeley Earth Surface Temperature project, did an extensive analysis of temperature data previously not included in the IPCC analysis and a re-analysis of the existing record (1.6 billion temperature measurements, 14 data sets, 38,000 stations), putting in measures to avoid data selection bias, data correction bias, and station quality bias, and testing for urban heat bias. To the author’s surprise, they came up with the same temperature rise reported by the IPCC, 0.9 Celsius over land, concluding that “none of the legitimate concerns of the skeptics had improperly biased the prior results” and suggesting to the author that “those groups had been vigilant in their analysis and treated the potential biases with appropriate care”. Furthermore, they demonstrated a close agreement between the temperature rise curve and the carbon dioxide rise curve when a smooth fit was done that included volcanic eruption data. The excellent fit between the temperature and CO2 curves “suggests that most – maybe all – of the warming of the past 250 years was caused by humans”, according to the author.

Based on these results, the author offers the following prediction: if the CO2 concentration increases exponentially and the greenhouse effect increases logarithmically, then the warming should grow linearly, so doubling the time interval doubles the temperature rise.  For example, assuming exponential growth of CO2 concentration, by 2052 the concentration would have doubled to 560 ppm, with a corresponding rise in land temperature of 1.6 Celsius. Forty years after 2052 there would be an additional 1.6 Celsius rise, and so on every 40 years until the CO2 rise is mitigated.

In the section on tipping points, the author discusses some positive and negative feedbacks that may occur as a result of increased CO2 and warming. A strong positive feedback can lead to runaway greenhouse warming. The tipping points identified so far that could trigger it are: loosening of the Antarctic ice sheet, which could slip into the sea and produce over 100 feet of sea level rise; melting of freshwater ice in Greenland, which could disrupt the Gulf Stream and change ocean current flow as far away as the Pacific; melting of permafrost, releasing the potent greenhouse gas methane and leading to further warming; and release of methane from the seabed as the Arctic water warms. An example of a negative feedback is an increase in water vapor cloud cover; a mere 2% increase could cancel any expected further warming even if CO2 concentrations double.

The author believes that the only solid evidence of warming is the temperature data; all other effects attributed to warming are “either wrong or distorted”.  In this section, he presents his views on these effects and how they may or may not be accurate or correlated with warming temperatures: hurricanes, tornadoes, polar warming, the so-called hockey stick data, and sea level rise. Toward the end, the author asks, “Can global warming be stopped, assuming it is a threat?” He highlights the important role of the developing nations in decreasing CO2 emissions, even though most of the CO2 now in the atmosphere came from developed nations.  The emerging economies would need to cut emissions intensity by 8-10% per year just to stabilize greenhouse emissions. Low-cost solutions and a switch from coal to natural gas are required to help China and other emerging nations cut emissions. The author believes that geoengineering solutions may never be taken seriously because of the danger of further altering the earth’s geochemistry and atmospheric chemistry without knowing the ultimate consequences.

Lastly, on the global warming controversy, the author’s statement is this: “The evidence shows that global warming is real, and the recent analysis of our team indicates that most of it is due to humans”.  He refers to global warming as both a scientific conclusion and a secular religion for both what he calls “alarmists” and “deniers”. He believes that it is a threat that needs to be addressed even if it is difficult to quantify.  He proposes that any solution should be inexpensive, because it is the developing world that will need it most.  The lowest-hanging fruit right now is a switch from coal to natural gas while technologies are developed to make other sources affordable.  An electric car is an expensive solution, and it produces more CO2 if the electricity comes from a coal-fired plant.


Richard A. Muller is a professor of physics at the University of California, Berkeley. He is the best-selling author of Physics for Future Presidents and The Instant Physicist. He and his wife live in Berkeley, California.

READING NOTES

PART I: ENERGY CATASTROPHES
·         Energy use in the United States alone is huge: about 20 million barrels of oil each day. Because the numbers are so large, energy accidents also make the news in a big way.
·         In this section, the author tackles 3 major energy catastrophes and offers facts and suggestions on how to interpret the ramifications of these accidents.
·         “We need to get our facts right, put the consequences in perspective, clear up misimpressions, and get to the core of what really happened, or is still to happen.”


Chapter 1: Fukushima Meltdown
      In March 2011, a huge earthquake measuring 9.0 on the Richter scale hit Japan, generating a tsunami 30 feet high, and up to 50 feet in some places.  About 15,000 people died and 100,000 buildings were destroyed.
      The Fukushima Nuclear Reactor was one recipient of the huge amount of energy unleashed by this earthquake, in the form of a 50-foot tsunami. At the site, two people died due to the earthquake and one due to the tsunami. No deaths were reported due to the nuclear meltdown that ensued.
      Nuclear energy releases are huge: fission of a single atom of uranium-235 can produce 20 million times the energy released in the decomposition of a molecule of TNT.
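
A back-of-the-envelope check of that 20-million ratio. The ~200 MeV per fission and ~4.6 MJ/kg for TNT used here are standard reference values, not figures from the book:

AVOGADRO = 6.022e23          # molecules per mole
EV_PER_JOULE = 6.242e18      # electron volts per joule

fission_ev = 200e6           # ~200 MeV released per U-235 fission (standard value)
tnt_j_per_kg = 4.6e6         # ~4.6 MJ/kg explosive energy of TNT (standard value)
tnt_molar_mass_kg = 0.227    # kg per mole of TNT (C7H5N3O6)

tnt_ev = tnt_j_per_kg * tnt_molar_mass_kg / AVOGADRO * EV_PER_JOULE
print(f"TNT: ~{tnt_ev:.0f} eV per molecule")            # ~11 eV
print(f"fission/TNT ratio: {fission_ev / tnt_ev:.1e}")  # ~1.9e7, i.e. ~20 million
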
      Along with energy, high-energy neutrons are also released, which is the basis for the enormously rapid and huge energy explosions that fissile material is capable of.  In a nuclear reactor, the energy production must be moderated: only 4% of the uranium fuel is uranium-235, and materials such as carbon or water are employed to slow the neutrons and control the reaction so that only one of the emitted neutrons triggers a new fission, maintaining a steady release of energy.
      Reactivity accidents result from runaway chain reactions, in which fission proceeds uncontrolled, starting slowly at first and building up to an energy density that results in a powerful explosion.
      In the Chernobyl reactivity accident of 1986, what killed most people was the radioactivity released, not the reactor explosion. In the Fukushima incident, the reactor did not explode, and pumps kept working to remove the heat produced by residual radioactivity after the reactor shut down on impact. The cooling pumps stopped working after 8 hours, when extensive infrastructure failure left no external source of electricity to keep them going.  Without the cooling pumps, the fuel overheated and melted, resulting in a release of radioactivity second only to the Chernobyl accident.
      The most dangerous radioactivity released is that from iodine-131 and cesium-137. I-131 has a half-life of 8 days and decays rapidly, releasing radioactivity as it does, making it the biggest source of radioactivity initially.  When it enters the body, it accumulates in the thyroid, where it can cause cancer.  I-131 absorption by the body can be mitigated by taking potassium iodide; ordinary iodine from this salt saturates the thyroid and prevents or slows absorption of the radioactive isotope.
      Cs-137 decays more slowly, so its initial impact is lower but it lasts longer.  Its half-life is 30 years.
      Sr-90 also undergoes slow decay. Slow decay means these isotopes are around longer and can deposit and accumulate in plants and animals that are then consumed, concentrating in bones.
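
A small decay-law sketch of why I-131 dominates the early dose while Cs-137 dominates the long term. The half-lives are from the notes above; the point that a shorter half-life means higher initial activity is standard physics, assuming comparable numbers of atoms released:

half_life_days = {"I-131": 8.0, "Cs-137": 30 * 365.25}

def remaining_fraction(isotope: str, days: float) -> float:
    # Fraction of the original atoms still undecayed after the given time.
    return 0.5 ** (days / half_life_days[isotope])

for day in (8, 80, 365):
    i131 = remaining_fraction("I-131", day)
    cs137 = remaining_fraction("Cs-137", day)
    print(f"day {day:>3}: I-131 {i131:.4f}  Cs-137 {cs137:.4f}")
# I-131 is nearly gone within months; Cs-137 barely changes for years, which is
# why iodine dominates the early dose and cesium the long-term contamination.
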
      An exposure of 100 rem or more will cause immediate radiation illness (nausea, weakness, loss of hair); at 250-350 rem, there is a 50% chance of death if untreated.
      The author offers the following formula for estimating excess cancer deaths: (population × average dose in rem) / 2,500.  In the example he gives, a population of 22,000 with an average exposure of 22 rem may suffer about 194 extra cancers.  To give some perspective, a 20% incidence rate of cancer in a population of 22,000 is about 4,400 cancers.  Even though the number of cancers caused by the radioactivity is less than 5% of that, they probably will be detectable, as most of them will be thyroid cancers due to radioactive iodine exposure.
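
The rule of thumb as code; the formula and the example numbers are the book’s, transcribed directly:

def excess_cancers(population: float, avg_dose_rem: float) -> float:
    # The author's rule of thumb: (population x average dose in rem) / 2,500
    return population * avg_dose_rem / 2500.0

print(excess_cancers(22_000, 22))   # 193.6, i.e. roughly 194 extra cancers
print(0.20 * 22_000)                # ~4,400 cancers expected from all causes
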
      Natural exposure to radiation is about 0.3 rem per year from cosmic radiation and from uranium, thorium, and naturally radioactive potassium in the ground, plus another 0.3 rem from x-rays and other medical procedures. In Denver, Colorado, add another 0.3 rem from radon emitted by tiny concentrations of uranium in granite. Despite this, Denver has a lower cancer rate than the rest of the US, and the added probability of dying from cancer implied by that extra 0.3 rem (by the author’s formula, 0.3/2,500) is 0.00012, a number so small that it prompts him to ask rhetorically, “Should an undetectable danger play a major role in determining policy?” He further states that the International Commission on Radiological Protection recommends evacuation when the radiation dose exceeds 0.1 rem per year, one-third the extra dose that Denver gets. This threshold was used to mandate evacuations at Chernobyl.
      The Fukushima Nuclear Reactor was not built to withstand a 9.0 earthquake and a 50-foot tsunami.
      Should the Fukushima accident be used as a reason for ending nuclear power?  The author offers the following guidelines: 1) “Make a nuclear power plant strong enough that if it is destroyed or damaged, the incremental harm is small compared to the damage done by the root cause,” and 2) the Denver dose should be used as the standard in planning a disaster response; e.g., the ICRP threshold for evacuation should be raised to at least 0.3 rem (3 millisieverts) per year.
      The author contends that the Fukushima reactor was designed adequately when viewed with these standards in mind.


CHAPTER 2: THE GULF OIL SPILL
·         The author takes a hard look at references made by the president and others to the Gulf Oil spill as the “greatest environmental disaster of all time” by offering some factual perspectives on the damage wrought by the spill.  The accident killed 11 people and injured 17 more.
o   6,000 dead animals due to the oil spill, versus 100 million to 1 billion bird deaths each year due to glass windows and another 100 million due to high-voltage electric lines
o   Beaches remained clean because BP hired fishermen to distribute buoys and barriers and spread dispersants to break up the oil, in contrast to the oil and tar that covered the Alaskan shore during the Exxon Valdez spill.
·         Author’s description of the Deepwater Horizon accident: The oil rig sat above 5,000 feet of water.  A flexible pipe 23,000 feet long connected it to the oil source 18,000 feet below the seafloor.  When the rig exploded, the pipe was damaged and oil started gushing out at 26 gallons per second. The leak was not plugged until July 15, 2010. It is estimated that the spill released 250 million gallons, or about a million cubic meters. Despite the continued flow of oil, the affected area did not increase any further; the author surmises that this was likely due to the oil evaporating, dispersing in the water, sinking, or being cleaned up. On September 19, the well was officially sealed.
·         The author estimates that with a spill area of about 10,000 square miles, if all this oil were dispersed uniformly through the water below, the resulting concentration would be less than 1 ppm, “below what is considered a toxic level” (a rough version of this arithmetic is sketched after these notes). The surfactants were added to break up big blobs of oil and keep them from forming, so that more of the oil is accessible to oil-digesting bacteria and it doesn’t gum up the feathers of birds or the fur of animals.
·         Natural oil leaks do occur in the seabed but they probably are only about 1% of the Deepwater spill.
·         A year after the initial spill, measurements showed that 99% of the water samples tested in the entire region, including the 1,000 square miles closest to the wellhead, had no detectable oil residue or dispersant.  Tourism was severely affected, with one estimate claiming a loss of 23 billion dollars over the ensuing three years. A year later, however, the Governor of Louisiana declared the “region reborn”.
·         The author believes that the President’s characterization of the disaster was hyperbole and that the overreaction to the spill was even more damaging.
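
A minimal sketch of the dilution arithmetic behind the under-1-ppm estimate above. The spill volume and the 10,000-square-mile area are the book’s figures; the 100-meter mixing depth is my own assumption, chosen only for illustration:

SQ_MILE_M2 = 2.59e6             # square meters per square mile

oil_m3 = 1.0e6                  # ~250 million gallons is about 1 million cubic meters
area_m2 = 10_000 * SQ_MILE_M2   # affected surface area
mixing_depth_m = 100.0          # assumed dispersal depth (my assumption)

ppm = oil_m3 / (area_m2 * mixing_depth_m) * 1e6
print(f"~{ppm:.1f} ppm oil by volume")   # ~0.4 ppm, consistent with "< 1 ppm"
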


CHAPTER 3: GLOBAL WARMING AND CLIMATE CHANGE

·         The level of carbon dioxide in the atmosphere has increased by 40% in the last century due to human use of fossil fuels.  Carbon dioxide makes up 0.04% of the atmosphere.  Water vapor is a more significant greenhouse gas, but we have no control over the amount that evaporates from bodies of water. Methane is also an important greenhouse gas.  The oxygen, argon, and nitrogen in the atmosphere are transparent to infrared radiation.
·         Physical calculations estimate that the earth’s temperature would be below freezing if not for the presence of greenhouse gases (a standard blackbody version of this calculation is sketched near the end of these notes).
·         In 2007, the IPCC reported that global temperature rose by 0.64 Celsius over the previous 50 years.  During those same years, the land temperature rose by 0.9 Celsius. Land temperatures rise by a greater amount because heat concentrates near the surface of the land, while in the ocean heat spreads down to depths of 100 feet. In the same report, the IPCC states that global warming has been happening since the 1800s, but the anthropogenic contribution is hard to determine because part of that earlier warming was due to changes in the sun’s intensity.
·         Despite the smallness of this temperature rise, scientists including the author are more concerned about greater warming occurring in the future.
·         The author’s group, through the Berkeley Earth Surface Temperature project, did an extensive analysis of temperature data previously not included in the IPCC analysis and a re-analysis of the existing record (1.6 billion temperature measurements, 14 data sets, 38,000 stations), putting in measures to avoid data selection bias, data correction bias, and station quality bias, and testing for urban heat bias. To the author’s surprise, they came up with the same temperature rise reported by the IPCC, 0.9 Celsius over land, concluding that “none of the legitimate concerns of the skeptics had improperly biased the prior results” and suggesting to the author that “those groups had been vigilant in their analysis and treated the potential biases with appropriate care”.
·         See page 76 (iBook version) for the group’s plot of the average global temperature rise over land from 1800 to the present. Dips in the otherwise rising temperature plot are attributed to volcanic eruptions and correlate with ice core measurements of sulfate particles. There was close agreement between the temperature rise curve and the carbon dioxide rise curve when a smooth fit was done that included volcanic eruption data, better than the author’s attempts using a parabola and other polynomial fits. “Our fit shows that one could ignore these (sunspot) cycles and get an excellent explanation of most of the data considering only carbon dioxide and volcanoes”.  The excellent fit between the temperature and CO2 curves “suggests that most – maybe all – of the warming of the past 250 years was caused by humans”, according to the author.
·         Based on these results, the author offers the following prediction: if the CO2 concentration increases exponentially and the greenhouse effect increases logarithmically, then the warming should grow linearly; doubling the time interval doubles the temperature rise (see the Python sketch near the end of these notes).  For example, assuming exponential growth of CO2 concentration, by 2052 the concentration would have doubled to 560 ppm, with a corresponding rise in land temperature of 1.6 Celsius. Forty years after 2052 there would be an additional 1.6 Celsius rise, and so on every 40 years until the CO2 rise is mitigated.
·         The logarithmic dependence of the greenhouse effect on CO2 concentration stems, according to the author, from “the fact that most of the effect comes from the edges of the CO2 absorption lines which only broaden logarithmically”.
·         In the section on tipping points, the author discusses some positive and negative feedbacks that may occur as a result of increased CO2 and warming:
·         A strong positive feedback can lead to runaway greenhouse warming, like the one that makes Venus a very hot planet. The tipping points identified so far that could trigger this are:
o   Loosening of the Antarctic ice sheet, which could slip into the sea and produce over 100 feet of sea level rise
o   Melting of freshwater ice in Greenland, which could disrupt the Gulf Stream and change ocean current flow as far away as the Pacific
o   Melting of permafrost and release of the potent greenhouse gas methane leading to further warming
o   Release of methane from the seabed as the Arctic water warms
·         An example of a negative feedback is an increase in water vapor cloud cover; a mere 2% increase could cancel any expected further warming even if CO2 concentrations double.  Poor understanding of cloud cover mechanisms contributes much of the uncertainty in warming predictions.
·         Local variability in temperature changes can mask the experience of global warming in different places.  About a third of temperature measurement stations report decreasing temperatures.  The author claims that a global increase of 2-3 Celsius will be felt globally and local temperature trends cannot negate it.
·         The author believes that the only solid evidence of warming is the temperature data; all other effects attributed to warming are “either wrong or distorted”.  He presents a review of some of these claims:
o   Hurricanes: the increase in hurricane frequency is more likely due to an increased capacity to detect them, even offshore.  Data for hurricanes that impact the US coast show no increase.  His conclusion: “the rate of hurricanes hitting the US has not been increasing”.
o   Tornadoes: measurements show a decreasing rate of tornadoes, verified by statistical analysis. Global warming theory predicted that tornadoes might increase, not that they would increase; more storms may be generated due to the energy available in a warming climate.  However, it is the temperature gradient, not the absolute temperature, that matters more in tornado formation. See graph.
o   Polar warming: older climate models actually predicted that Antarctic ice would increase, not decrease; a higher rate of evaporation due to sea warming can increase the amount of snow falling on Antarctica, which stays below freezing even with warming temperatures. Satellite measurements showed, however, that the Antarctic has lost 36 cubic miles of ice.  Models were tweaked and were able to reproduce this result.  Modeling Antarctic ice can produce unreliable results because Antarctica covers only 2.7% of the globe, too small an area for precise predictions.  The models and observations for the Arctic are consistent with each other: decreasing ice. The author states that it is difficult to determine the cause: global warming and/or decadal oscillations in sea surface temperature and pressure.
o   Hockey stick data: adjustment of temperature data, purportedly to “hide” data that seemed to indicate decreasing temperatures, by replacing proxy data with actual thermometer data.  See “Climategate”.
o   Sea level rise: the IPCC reports that sea level has risen by 8 inches in the last century (from records of tide levels).  The rise could be attributed to warmer water, which expands, and to the melting of glaciers; it is difficult to determine the ultimate cause.  The melting of glaciers in Greenland is attributed to soot pollution. The IPCC predicts a further 1-2 foot rise in sea level through the remainder of the century.
·         “Can global warming be stopped assuming it is a threat?” A treaty requiring an 80% cut in greenhouse emissions by the US by 2080 and a 70% cut in emissions intensity by China and the developing world by 2040 would not result in decreased atmospheric carbon dioxide concentrations, according to the author.  Under this treaty and the numbers involved, the author calculates that total atmospheric CO2 would increase to above 1,000 ppm (currently around 400 ppm), which, using IPCC models, would lead to a global temperature increase of 3 Celsius. In 2010, China’s CO2 emissions were 70% higher than those of the US, while its CO2 emissions per capita were only 45% of the US level. President Obama did not sign the Copenhagen treaty because of China’s refusal to allow inspections.  China’s emissions intensity is now 5 times that of the US; with per capita emissions growing 6% per year relative to the US rate, China will surpass US per capita emissions by 2025 (a quick compounding check of this date is sketched at the end of these notes).  Because energy use correlates with wealth, the author asks rhetorically, “If you were the president of China, would you endanger progress to avoid a few degrees of temperature change?”  Slowing growth in China could trigger political instability, adds the author.  See figures 1.6 and 1.17.   “Every 10% cut in US emissions is negated by 6 months of China’s emission growth.” Reducing its dependence on coal and switching to natural gas can help reduce China’s CO2 emissions (natural gas releases only half the CO2).  The author highlights the important role of the developing nations in decreasing CO2 emissions, even though most of the CO2 now in the atmosphere came from developed nations.  The emerging economies would need to cut emissions intensity by 8-10% per year just to stabilize greenhouse emissions. Low-cost solutions and a switch from coal to natural gas are required to help China cut emissions.
·         Geoengineering: some proposed solutions are below.  The author believes these solutions may never be taken seriously because of the danger of further altering the earth’s geochemistry and atmospheric chemistry without knowing the ultimate consequences.
·         Dumping iron in the ocean to encourage plant growth
·         Cloud-seeding methods to increase cloud formation
·         Releasing sulfate particles into the stratosphere to form aerosols that would reflect sunlight. “A simple calculation suggests that just one pound of sulfates injected into the stratosphere could offset the warming caused by thousands of pounds of carbon dioxide.”
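
The “below freezing without greenhouse gases” estimate mentioned at the top of these notes can be reproduced with a standard blackbody calculation; the solar constant and albedo here are textbook reference values, not the book’s:

SOLAR_CONSTANT = 1361.0   # W/m^2 arriving at Earth's distance from the sun
ALBEDO = 0.30             # fraction of sunlight reflected straight back to space
SIGMA = 5.67e-8           # Stefan-Boltzmann constant, W/(m^2 K^4)

absorbed = SOLAR_CONSTANT * (1 - ALBEDO) / 4      # averaged over the whole sphere
t_k = (absorbed / SIGMA) ** 0.25                  # equilibrium temperature
print(f"{t_k:.0f} K, i.e. {t_k - 273.15:.0f} C")  # ~255 K, well below freezing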

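The warming-prediction rule of thumb from the notes above, in the same style. The 1.6 Celsius of land warming per CO2 doubling and the 280-to-560 ppm doubling are the book’s figures; the code just shows how logarithmic forcing turns each further doubling into an equal temperature step, so exponential CO2 growth gives warming that is linear in time:

import math

S = 1.6        # Celsius of land warming per doubling of CO2 (the book's figure)
C0 = 280.0     # ppm, preindustrial baseline

def land_warming(co2_ppm: float) -> float:
    # Logarithmic forcing: warming grows with log2 of the concentration ratio.
    return S * math.log2(co2_ppm / C0)

print(land_warming(560))    # 1.6 C at one doubling (the book's 2052 figure)
print(land_warming(1120))   # 3.2 C after a second doubling
print(land_warming(2240))   # 4.8 C after a third: equal steps, linear in time
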
·         On the global warming controversy, the author’s statement is this: “The evidence shows that global warming is real, and the recent analysis of our team indicates that most of it is due to humans”.  He refers to global warming as both a scientific conclusion and a secular religion for both what he calls “alarmists” and “deniers”. He believes that it is a threat that needs to be addressed even if it is difficult to quantify.  He proposes that any solution should be inexpensive, because it is the developing world that will need it most.  The lowest-hanging fruit right now is a switch from coal to natural gas while technologies are developed to make other sources affordable.  An electric car is an expensive solution, and it produces more CO2 if the electricity comes from a coal-fired plant.
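
Finally, the compounding check referenced in the China note above; the 45% starting ratio and 6% annual growth relative to the US rate are the book’s figures:

import math

ratio0 = 0.45    # China's per capita emissions as a fraction of the US level, 2010
growth = 1.06    # 6% annual growth relative to the US rate

# Years until parity: solve 0.45 * 1.06^n = 1 for n.
years = math.log(1 / ratio0) / math.log(growth)
print(f"~{years:.1f} years from 2010")   # ~13.7 years, i.e. around 2024-2025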
