Saturday, May 16, 2015

ENERGY FOR FUTURE PRESIDENTS (Part I - Chapter 3 Partial)

Muller, Richard A. Energy for Future Presidents: The Science Behind the Headlines. New York: W. W. Norton and Company, 2012.

The print version of this book is 368 pages in length.

In this book, the author presents a series of lessons to future presidents on various sectors of energy use and on alternative energy prospects, with the goal of clarifying, correcting, and expanding on the information behind the news headlines.  From the author’s perspective, the president has a responsibility to be knowledgeable about these areas and should act as a “teacher” to the public, drawing on information that goes beyond the headlines to make informed decisions about energy. He tackles a wide-ranging list of energy and energy-related topics, including energy-related disasters, global warming, shale oil, alternatives to transportation fuel, energy efficiency and conservation, solar energy, wind energy, energy storage, nuclear power, biofuels, synfuels, hydrogen fuel, hybrid autos, and carbon dioxide mitigation measures.

I chose this book despite its broad coverage because energy is a shared purview of both physics and chemistry.  The theme of the book is to look at headlines and provide a scientific and mathematical perspective that informs people’s interpretation and perception of these issues.  These are the same headlines that the president, I, and my students invariably read every day and that are, for many, the primary source of information.

In Part I, the author provides his perspectives on three major energy catastrophes, presenting some facts, his interpretation of the risks and ramifications, and his opinion on how these should inform government decisions and response.

The first chapter deals with the Fukushima reactor meltdown following damage from the earthquake and tsunami of March 2011.  He predicts that the number of deaths from cancer caused by the radiation will be small, less than 1% of the human death toll from the earthquake and tsunami themselves. On this basis, he proposes that nuclear reactors should be built strong enough that fewer deaths result from radiation released when they are damaged than from the cause of the damage itself.  He also proposes using the average annual radiation dose that people in Denver receive as a standard for determining the disaster response to a radiation release.  Against these two standards, he argues that Fukushima was actually adequately designed, given the low projected human death toll, even though it was not built to withstand a 9.0 earthquake and a 50-foot tsunami.

In Chapter 2, the author questions the President’s characterization of the Gulf oil spill of 2010, caused by the Deepwater Horizon oil rig accident, as the “greatest environmental disaster” in history. He argues that the ensuing animal deaths, around 6,000, were small relative to the hundreds of millions of bird deaths each year from collisions with glass windows and high-voltage lines.  The beaches remained relatively clean compared to the damage done to the Alaskan shore by the Exxon Valdez spill.  He “senses” that the overreaction did more damage to the region, through its effect on tourism and the local economy, than the spill itself.

Richard A. Muller is a professor of physics at the University of California, Berkeley. He is the best-selling author of Physics for Future Presidents and The Instant Physicist. He and his wife live in Berkeley, California.

READING NOTES

PART I: ENERGY CATASTROPHES
·        Energy use in the United States alone is huge: about 20 million barrels of oil each day. Because the numbers involved are so large, energy accidents also make the news in a big way.
·        In this section, the author tackles 3 major energy catastrophes and offers facts and a suggestion on how to interpret the ramifications of these accidents.
·        “We need to get our facts right, put the consequences in perspective, clear up misimpressions, and get to the core of what really happened, or is still to happen.”


CHAPTER 1: FUKUSHIMA MELTDOWN
      In March 2011, a huge earthquake measuring 9.0 on the Richter scale hit Japan, generating a tsunami 30 feet high, and up to 50 feet in some places.  About 15,000 people died and 100,000 buildings were destroyed.
      One of the recipients of the huge amount of energy unleashed by this earthquake, through a 50-foot tsunami, was the Fukushima nuclear reactor. At the site, two people died due to the earthquake and one due to the tsunami. No deaths were reported due to the nuclear meltdown that ensued as a result of the impact.
      Nuclear energy releases are huge: fission of an atom of uranium-235 can produce 20 million times the energy released in the decomposition of a molecule of TNT.
      Along with energy, high-energy neutrons are also released; these are the basis for the enormously rapid, enormous energy release that fissile material is capable of.  In a nuclear reactor, energy production must be moderated: only 4% of the uranium fuel is uranium-235, and materials such as carbon or water are employed to keep the chain reaction in check (on average, only one of the emitted neutrons triggers a new fission) while still maintaining a steady release of energy.
      Reactivity accidents result from runaway chain reactions: the process undergoes uncontrolled fission that starts slowly at first and builds up to an energy density that then produces a powerful explosion (see the sketch below).
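      A rough way to see why this moderation matters is to track the effective multiplication factor k, the average number of neutrons from each fission that go on to cause another fission. The short Python sketch below is my illustration, not the author's; the k values and generation count are arbitrary assumptions.

```python
# Illustrative sketch (not from the book): how the neutron population evolves
# over successive fission generations for different multiplication factors k.

def neutron_population(k, generations, start=1.0):
    """Neutron population after the given number of fission generations."""
    population = start
    for _ in range(generations):
        population *= k  # each generation multiplies the population by k
    return population

for k in (0.99, 1.00, 1.01):  # assumed values: subcritical, critical, supercritical
    print(f"k = {k}: population after 1000 generations = "
          f"{neutron_population(k, 1000):.3g}")

# k < 1 dies away, k = 1 gives the steady output a power reactor needs,
# and k even slightly above 1 grows explosively -- the runaway chain
# reaction behind a reactivity accident.
```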
      In the Chernobyl reactivity accident of 1986, what killed most people was the radioactivity released, not the reactor explosion itself. In the Fukushima incident the reactor did not explode, and pumps kept working to carry away the heat produced by residual radioactivity after the reactor shut down upon impact. The cooling pumps stopped after 8 hours, however, because extensive infrastructure failure left no external source of electricity to keep them running.  Without the cooling pumps, the fuel overheated and melted, resulting in a release of radioactivity second only to that of the Chernobyl accident.
      The most dangerous radioactivity released is that from iodine-131 and cesium-137. I-131 has a half-life of 8 days and decays rapidly, releasing radiation as it does, which makes it the biggest source of radioactivity initially.  When it enters the body, it accumulates in the thyroid, where it can cause cancer.  I-131 absorption by the body can be mitigated by taking potassium iodide; normal iodine from this salt saturates the thyroid and prevents or slows the uptake of the radioactive isotope.
      Cs-137 decays more slowly, so its initial impact is lower but it lasts longer; its half-life is 30 years.
      Sr-90 also undergoes a slow decay. The slow decay means these isotopes are around longer and can deposit and accumulate in plants and animals that are consumed, concentrating in bones. (The half-life arithmetic is sketched below.)
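      The different time scales of I-131 and Cs-137 follow directly from the half-life relation N(t) = N0 · (1/2)^(t / t_half). A minimal sketch of that arithmetic (my illustration, using the half-lives quoted above):

```python
# Fraction of a radioactive isotope remaining after a given time,
# using N(t) = N0 * 0.5 ** (t / half_life).

def fraction_remaining(t_days, half_life_days):
    return 0.5 ** (t_days / half_life_days)

I131_HALF_LIFE_DAYS = 8.0            # from the notes above
CS137_HALF_LIFE_DAYS = 30 * 365.25   # ~30 years, expressed in days

for days in (8, 80, 365):
    print(f"after {days:3d} days: "
          f"I-131 {fraction_remaining(days, I131_HALF_LIFE_DAYS):.1e} remaining, "
          f"Cs-137 {fraction_remaining(days, CS137_HALF_LIFE_DAYS):.3f} remaining")

# I-131 is intense at first but essentially gone within a few months;
# Cs-137 barely decays over a year and lingers for decades.
```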
      An exposure of 100 rem or more will cause immediate radiation sickness (nausea, weakness, loss of hair); at 250-350 rem, there is a 50% chance of death if untreated.
      The author offers the following formula for estimating deaths due to cancer: (population x average dose in rem) / 2500.  In the example he gives, this formula estimates that a population of 22,000 with an average exposure of 22 rem may suffer about 194 extra cancers.  For perspective, a 20% cancer incidence rate in a population of 22,000 amounts to about 4,400 cancers.  Even though the cancers caused by the radioactivity add less than 5%, they probably will be detectable because most of them will be thyroid cancers due to radioactive iodine exposure. (A quick application of the formula follows below.)
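      The rule of thumb is easy to apply directly. A quick sketch of the arithmetic (mine; the Denver comparison at the end is my own connection to the next note, not a calculation the book spells out):

```python
# Author's rule of thumb: excess cancers ~= (population * average dose in rem) / 2500.

def excess_cancers(population, average_dose_rem):
    return population * average_dose_rem / 2500

# The book's example: 22,000 people with an average exposure of 22 rem.
print(excess_cancers(22_000, 22))   # ~194 extra cancers
print(0.20 * 22_000)                # ~4,400 cancers expected anyway at a 20% rate

# Applying the same rule to one year of Denver's ~0.3 rem extra radon dose
# reproduces the 0.00012 probability quoted in the next note.
print(excess_cancers(1, 0.3))       # 0.00012
```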
      Natural exposure to radiation is about 0.3 rem per year from cosmic radiation and from uranium, thorium, and naturally radioactive potassium in the ground, plus another 0.3 rem from x-rays and other medical treatments. In Denver, Colorado, add another 0.3 rem from radon emitted by tiny concentrations of uranium in granite. Despite this, Denver has a lower cancer rate than the rest of the US, and the probability of dying of cancer from that extra dose is 0.00012, a number so small that it prompts him to ask rhetorically, “Should an undetectable danger play a major role in determining policy?” He further states that the International Commission on Radiological Protection recommends evacuation when the radiation dose exceeds 0.1 rem per year, one-third of the extra dose that Denver gets. Chernobyl used this threshold to mandate evacuations.
      The Fukushima Nuclear Reactor was not built to withstand a 9.0 earthquake and a 50-foot tsunami.
      Should the Fukushima accident be used as a reason for ending nuclear power?  The author offers the following guidelines: 1) “Make a nuclear power plant strong enough that if it is destroyed or damaged, the incremental harm is small compared to the damage done by the root cause”; and 2) the Denver dose should be used as the standard in planning a disaster response, e.g., the ICRP threshold for evacuation should be raised to at least 0.3 rem (3 millisieverts) per year.
      The author contends that the Fukushima reactor was designed adequately when viewed with these standards in mind.


CHAPTER 2: THE GULF OIL SPILL
·        The author takes a hard look at references by the president and others to the Gulf oil spill as the “greatest environmental disaster of all time,” offering some factual perspective on the damage wrought by the spill.  The accident killed 11 people and injured 17 more.
o   6,000 dead animals due to the oil spill, versus 100 million to 1 billion bird deaths each year due to glass windows and another 100 million due to high-voltage electric lines
o   Beaches remained relatively clean because BP hired fishermen to distribute buoys and barriers and to spread dispersants that break up the oil, in contrast to the oil and tar that covered the Alaskan shores during the Exxon Valdez spill.
·        Author’s description of the Deepwater Horizon accident: The oil rig sat above 5,000 feet of water.  A flexible pipe 23,000 feet long connected it to the oil source 18,000 feet below the seafloor.  When the rig exploded, the pipe was damaged and oil started gushing out at 26 gallons per second. The leak was not plugged until July 15, 2010. It is estimated that the spill released 250 million gallons, or about a million cubic meters. Despite the continued flow of oil, the affected area did not grow any further; the author surmises that this was likely due to the oil evaporating, dispersing in the water, sinking, or being cleaned up. On September 19, the well was officially sealed.
·        The author estimates that, with a spill area of about 10,000 square miles, if all this oil were dispersed uniformly in that volume of water, the resulting concentration would be less than 1 ppm, “below what is considered a toxic level” (a quick sanity check of these numbers appears after these notes). The surfactants were added to break up the oil and keep big blobs from forming, so that more of it is accessible to oil-digesting bacteria and it doesn’t gum up the feathers of birds or the fur of animals.
·        Natural oil leaks do occur in the seabed but they probably are only about 1% of the Deepwater spill.
·        A year after the initial spill, measurements showed that 99% of the water samples tested in the entire region, including the 1,000 square miles closest to the wellhead, had no detectable oil residue or dispersant.  Tourism was severely affected, with one estimate claiming a loss of 23 billion dollars over the ensuing three years. A year later, however, the Governor of Louisiana declared the “region reborn.”
·        The author believes that the President’s characterization of the disaster was hyperbole and that the overreaction to the spill was even more damaging than the spill itself.
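Two of the numbers in these notes are easy to sanity-check in a few lines: the conversion of 250 million gallons to roughly a million cubic meters, and the claim that spreading that volume of oil through the affected water gives well under 1 ppm. The sketch below is mine, not the book's; in particular the mixing depth is an assumed figure (the Gulf near the wellhead is on the order of 1,500 m deep), not one the author states.

```python
# Sanity checks on the spill figures; the mixing depth is an assumption.

GALLON_TO_M3 = 0.003785      # cubic meters per US gallon
SQ_MILE_TO_M2 = 2.59e6       # square meters per square mile

spill_m3 = 250e6 * GALLON_TO_M3
print(f"spill volume ~ {spill_m3:.2e} m^3")   # ~9.5e5, i.e. about a million m^3

area_m2 = 10_000 * SQ_MILE_TO_M2              # ~10,000 square miles of affected sea
mixing_depth_m = 1_000                        # assumed average mixing depth
water_m3 = area_m2 * mixing_depth_m

ppm_by_volume = spill_m3 / water_m3 * 1e6
print(f"oil concentration ~ {ppm_by_volume:.3f} ppm by volume")  # well under 1 ppm
```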


CHAPTER 3: GLOBAL WARMING AND CLIMATE CHANGE

·        The level of carbon dioxide in the atmosphere has increased by 40% in the last century due to human use of fossil fuels.  Carbon dioxide makes up 0.04% of the atmosphere.  Water vapor is a more significant greenhouse gas, but we have no control over the amount that evaporates from bodies of water. Methane is also an important greenhouse gas.  The oxygen, argon, and nitrogen in the atmosphere are transparent to infrared radiation.
·        Physical calculations estimate that the earth’s temperature would be below freezing if not for the presence of greenhouse gases.
·        In 2007, the IPCC reported that the global temperature rose by 0.64 Celsius over the previous 50 years.  During those same years, the land temperature rose by 0.9 Celsius. Land temperatures rise by a greater amount because heat concentrates near the surface of the land, whereas in the ocean heat spreads down to depths of 100 feet. In the same report, the IPCC states that global warming has been happening since the 1800s, but the anthropogenic contribution is hard to determine because part of that earlier warming was due to changes in the sun’s intensity.
·        Despite the smallness of this temperature rise, scientists including the author are more concerned about greater warming occurring in the future.
·        The author’s group, through the Berkeley Earth Surface Temperature project, did an extensive analysis of temperature data previously not included in the IPCC analysis and a re-analysis of existing temperature records (1.6 billion temperature measurements, 14 data sets, 38 stations), putting in measures to avoid data-selection bias, correction bias, and station-quality bias, and testing for urban heat bias. To the author’s surprise, they came up with the same temperature rise reported by the IPCC, 0.9 Celsius over land, concluding that “none of the legitimate concerns of the skeptics had improperly biased the prior results,” which suggests to the author that “those groups had been vigilant in their analysis and treated the potential biases with appropriate care.”
·        See page 76 (iBook version) for the group’s plot of the average global temperature rise over land from 1800 to the present. Dips in the otherwise rising temperature curve are attributed to volcanic eruptions and correlate with ice-core measurements of sulfate particles. There was close agreement between the temperature curve and the carbon dioxide curve when a smooth fit was done using the volcanic eruption data, better than the author’s attempts at using a parabola and other polynomial fits. “Our fit shows that one could ignore these (sunspot) cycles and get an excellent explanation of most of the data considering only carbon dioxide and volcanoes.”  The precise fit between the temperature and CO2 curves “suggests that most – maybe all – of the warming of the past 250 years was caused by humans,” according to the author.
·        Based on these results, the author offers the following prediction: if the CO2 concentration increases exponentially and the greenhouse effect increases logarithmically, then the warming should grow linearly, so doubling the time interval doubles the temperature rise.  For example, assuming exponential growth of the CO2 concentration, by 2052 the CO2 concentration will have doubled to 560 ppm, with a corresponding rise in land temperature of 1.6 Celsius. In the 40 years after 2052 there will be an additional 1.6 Celsius rise, and so on every 40 years until the CO2 rise is mitigated (a short sketch of this arithmetic follows these notes).

·        The logarithmic dependence of the greenhouse effect on CO2 concentration stems from, according to the author, “the fact that most of the effect comes from the edges of the CO2 absorption lines which only broaden logarithmically”.
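The author’s linear-warming prediction follows from combining the two assumptions stated above: CO2 rising exponentially and the greenhouse effect growing only logarithmically with concentration. The sketch below is my illustration of that arithmetic, not the book’s calculation; the 280 ppm baseline, the 1.6 Celsius of land warming per doubling, and the 40-year doubling time after 2052 are assumptions taken from, or implied by, the notes above.

```python
import math

# Exponential CO2 growth + logarithmic forcing => warming that is linear in time.
# Assumptions (mine, consistent with the notes): 280 ppm pre-industrial baseline,
# 560 ppm reached in 2052, 1.6 C of land warming per CO2 doubling,
# and CO2 doubling every 40 years thereafter.

C0 = 280.0              # ppm, pre-industrial baseline
SENSITIVITY = 1.6       # degrees C of land warming per CO2 doubling
DOUBLING_TIME = 40.0    # years for CO2 to double

def co2(years_after_2052, c_2052=560.0):
    """Exponential growth: concentration doubles every DOUBLING_TIME years."""
    return c_2052 * 2 ** (years_after_2052 / DOUBLING_TIME)

def land_warming(years_after_2052):
    """Logarithmic forcing: each doubling adds the same SENSITIVITY degrees."""
    return SENSITIVITY * math.log2(co2(years_after_2052) / C0)

for years in (0, 40, 80, 120):
    print(f"{2052 + years}: CO2 ~ {co2(years):5.0f} ppm, "
          f"land warming ~ {land_warming(years):.1f} C above pre-industrial")

# The printout rises by the same 1.6 C every 40 years: the exponential and the
# logarithm cancel, leaving a straight line until the CO2 rise is mitigated.
```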
