It’s Not Fracking to be Concerned with, It’s a Fugitive Hiding Along the Supply Chain that is a Clear-and-Present Danger!

April 22, 2015

In his January 2015 State of the Union address, President Obama emphasized two goals: the critical need to limit greenhouse gas pollution, and support for domestic natural gas and oil production, as well as renewable energy sources. His administration is seeking a 40 percent to 45 percent reduction in methane leaks and emissions of other volatile organic compounds from oil and gas wells and supporting infrastructure.

In support of these goals, the Environmental Protection Agency (EPA) announced it intends to regulate methane emissions from the oil and gas sector directly, rather than relying on voluntary programs or regulating associated pollutants. The proposal would be the first-ever direct regulation of methane.

This presentation will look at methane emissions – fugitive, vented, and flared – from the oil and natural gas system: their impacts, sources, and remediation in the context of the regulatory landscape and economic incentives.

Many view natural gas as a transitional fuel, allowing continued dependence on fossil fuels yet reducing greenhouse gas (GHG) emissions compared to oil or coal over coming decades.

Development of “unconventional” gas dispersed in shale, employing horizontal drilling and hydraulic fracturing (“fracking”) technologies, is part of this vision, as the potential resource may be large and, in many regions, conventional reserves exploited with well-known vertical drilling techniques are becoming depleted.

The Department of Energy predicts that by 2035 total domestic production will grow by 20%, with unconventional gas providing 75% of the total. The greatest growth is predicted for shale gas, increasing from 16% of total production in 2009 to an expected 45% in 2035.

Although natural gas is promoted as a bridge fuel over the coming few decades, in part because of its presumed benefit for global warming compared to other fossil fuels, relatively little is known about the GHG footprint of methane emissions from the oil and gas industry.

While methane is valuable as a fuel, it is also a greenhouse gas at least 21 times, possibly as much as 32 times, more potent than carbon dioxide over a 100-year period, with even greater relative impacts over shorter periods. In late 2010, the U.S. Environmental Protection Agency issued a report concluding that fugitive emissions from the natural gas system from wellhead to burner may be far greater than previously thought.

1 lb. CH4 equals 21 to 32 lbs. CO2

These fugitive emissions of methane are of particular concern. Methane is the major component of natural gas and a powerful greenhouse gas. As such, small leakages are important. Recent modeling indicates methane has an even greater global warming potential than previously believed, when the indirect effects of methane on atmospheric aerosols are considered.
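
To make the potency comparison concrete, here is a minimal back-of-the-envelope sketch in Python that converts a mass of methane into its CO2-equivalent using the 21-to-32 GWP range cited above; the leak quantity is a made-up illustration, not data from this post.

```python
# A minimal sketch: convert a mass of leaked methane into CO2-equivalent
# using the 100-year global warming potential (GWP) range cited above.
# The leak size below is a hypothetical example, not a measured value.

GWP_CH4_LOW = 21   # lb CO2e per lb CH4 (older IPCC/EPA 100-year value)
GWP_CH4_HIGH = 32  # lb CO2e per lb CH4 (upper estimate incl. indirect effects)


def co2_equivalent(ch4_mass_lbs: float, gwp: float = GWP_CH4_LOW) -> float:
    """Return the CO2-equivalent mass (lbs) for a given mass of methane (lbs)."""
    return ch4_mass_lbs * gwp


leaked_ch4_lbs = 1_000  # hypothetical leak
low = co2_equivalent(leaked_ch4_lbs, GWP_CH4_LOW)
high = co2_equivalent(leaked_ch4_lbs, GWP_CH4_HIGH)
print(f"{leaked_ch4_lbs:,} lb CH4 ~ {low:,.0f} to {high:,.0f} lb CO2e over 100 years")
```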

Emissions are either fugitive, vented, or flare-related:
• Fugitive emissions are those that “leak” unintentionally from equipment such as pumps, valves, flanges, and other components – air emissions from locations other than stacks, vents, chimneys, or other fixed locations designed for releasing emissions.
• Vented emissions are releases due to equipment design or operational procedures, such as from pneumatic device bleeds, blowdowns, incomplete combustion, or equipment venting.
• Flaring is a combustion process used in petroleum refineries, chemical plants, and natural gas processing plants, as well as at oil and gas production sites, offshore oil and gas rigs, and landfills. Methane emissions from flaring are due to incomplete combustion.

Despite all the talk about climate change, anthropogenic greenhouse gas emissions, and methane’s potent global warming effect, it is surprising to discover that, as of today, the U.S. Environmental Protection Agency exempts the oil and gas industry from direct controls on natural gas emissions.

The global methane budget is poorly constrained, with multiple sources and sinks all carrying large uncertainties; recent work suggests fossil fuels may be a far larger source of atmospheric methane than generally considered.

The Natural Gas System
The natural gas system or value chain is a highly integrated system where gas is produced, processed, and delivered to consumers.

Gas industry flow chart

The complexity and extent of the system in the U.S. make the problem of accurately identifying point sources of emissions, and reducing those emissions, more difficult. Additionally, each sector has different factors affecting where, when, and how much CH4 is emitted.

The cost of finding and repairing major emitters is one of the primary reasons the EPA exempted the industry from controlling emissions under the Clean Air Act.

Field Production: In this initial stage, wells are used to withdraw raw gas from underground formations. The oil and gas industry is an aggregate of 21 major production companies and another 6,000 or so production companies of all sizes. There are about 2.7 million wells in the United States, of which 900,000 are active and about 1.8 million are abandoned. Of the 900,000 active wells, 400,000 produce oil and approximately 500,000 produce natural gas, dispersed across 33 states.

Gathering is the system of pipes that collects gas from the wells for downstream processing. While some of the needed processing can be accomplished at or near the wellhead (field processing), the complete processing of natural gas takes place at a processing plant, usually located in a natural gas producing region. The extracted natural gas is transported to these processing plants through a network of gathering pipelines, which are small-diameter, low-pressure pipes. There are about 200,000 miles of gathering pipe, typically 8-5/8 inches in diameter or less, transporting natural gas at about 500 psi, along with more than 10,000 gathering stations and 100,000 gathering compressors.

Processing. Natural gas used by consumers is much different from the natural gas brought from underground up to the wellhead. Although the processing of natural gas is in many respects less complicated than the processing and refining of petroleum, it is equally necessary before its use by end users. The natural gas used by consumers is composed almost entirely of methane. However, natural gas found at the wellhead, although still composed primarily of methane, is by no means as pure. Natural gas processing, at any one of 580 processing plants in the U.S., consists of separating the various impurities, other hydrocarbons, and fluids from the natural gas to produce what is known as ‘pipeline quality’ natural gas that meets specified tariffs. Pipeline quality natural gas is 95-98 percent methane.

Transmission and Storage. Transmission involves the delivery of natural gas from the wellhead and processing plant to city gate stations or industrial end users. Transmission occurs through a vast network of high-pressure pipelines. Natural gas storage falls within this sector. Natural gas is typically stored in depleted underground reservoirs, aquifers, and salt caverns.

The transmission sector includes about 320,000 miles of large diameter interstate and intrastate pipelines, between 24 and 48 inches in diameter. The pipes transport pressurized natural gas at 1,000 psi from any one of 1,800 compressor stations. The transportation system also includes about 400 underground storage facilities consisting of depleted underground reservoirs, aquifers, and salt caverns.

Within this network, there are more than 11,000 delivery points, 5,000 receipt points, and 1,400 interconnection points that provide for the transfer of natural gas throughout the United States. Twenty-nine hubs or market centers provide additional interconnections.

Distribution focuses on the delivery of natural gas from the major pipelines to the end users (e.g., residential, commercial and industrial). Distribution pipelines take the high-pressure gas from the transmission system at “city gate” stations, reduce the pressure and distribute the gas through primarily underground mains and service lines to individual end users.

There were over 2 million miles of distribution mains in 2011. The distribution sector is operated by 1,200 companies that serve 66 million residential, 5.3 million commercial, and 191,000 industrial customers, as well as 1,700 natural gas-fired electricity power plants, throughout the United States.

The natural gas is periodically compressed to ensure pipeline flow, although local compressor stations are typically smaller than those used for interstate transportation. Because of the smaller volumes of natural gas to be moved, as well as the small-diameter pipe that is used, the pressure required to move natural gas through the distribution network is much lower than that found in the transmission pipelines.

While natural gas traveling through interstate pipelines may be compressed to as much as 1,500 pounds per square inch (psi), natural gas traveling through the distribution network requires as little as 3 psi of pressurization and can be as low as ¼ psi at the customer’s meter.

Overall, it is no wonder that such a massive, distributed system of pipes, valves, pumps, compressors, connectors, and flanges is prone to leaks.

There are several tangible and intangible drivers forcing the reduction of methane emissions. From a social perspective, there is climate change and a growing public awareness that the scientists may well have been right: the warming effect of anthropogenic greenhouse gas emissions of CO2 and CH4 is having a negative impact on the world’s environment, right before our eyes.

Reducing methane emissions, as we will see, is a powerful way to take action on climate change; and putting the wasted methane to use can support local economies with a source of clean energy that generates revenue, spurs investment, improves safety, and leads to cleaner air. That is why in his Climate Action Plan of 2014, President Obama directed the Administration to develop a comprehensive, interagency strategy to cut methane emissions.

Categorizing the U.S. Greenhouse Gas Inventory by Type of Gas
The critical question is: Given the current extent of U.S. natural gas production—and the fact that production is projected to expand by more than 50 percent in the coming decades—what are the baseline natural gas fugitive emissions, and are we doing everything we can to ensure that emissions are as low as is technologically and economically feasible?

U.S. Greenhouse Gas Inventory 2011

This pie chart, U.S. Greenhouse Gas Inventory, illustrates the relative contribution of direct greenhouse gases to total U.S. emissions in 2011. The primary greenhouse gases in the U.S. inventory are CO2 and CH4, contributing approximately 82 percent and 10 percent of total greenhouse gas emissions, respectively. Source: U.S. EPA Draft Inventory of U.S. Greenhouse Gas Emissions and Sinks: 1990 to 2011.

Defining Atmospheric Methane by Industry
This pie chart, U.S. Methane Emissions by Source, provides a summary of the major contributors of CH4 emissions. The primary sources of atmospheric methane were natural gas systems, enteric fermentation associated with domestic livestock digestion of feedstock (occurring in the intestines), and decomposition of wastes in landfills; see the pie chart below.

U.S. Methane Emissions by Source

Natural gas systems are the largest anthropogenic source of CH4 emissions in the United States, accounting for approximately one-quarter of total CH4 emissions in 2011.

Enteric fermentation, the second largest anthropogenic source of CH4 emissions in the United States in 2011, contributed about 23 percent of total CH4 emissions, an increase of 3.5 percent since 1990. This increase in emissions from 1990 to 2011 generally follows the increasing trend in cattle populations.

Landfills, the third largest anthropogenic source of CH4 emissions in the United States, accounted for 17.5 percent of total CH4 emissions in 2011.

In 2011, CH4 emissions from coal mining were about 10 percent of total CH4 emissions. This represents an overall decline of 24.8 percent from 1990, resulting from the mining of less gassy coal from underground mines and the increased use of CH4 collected from degasification systems.

Manure management methane emissions were about 9% of the total CH4 emissions. Manure management refers to the capture, storage, treatment, and utilization of animal manures.

Petroleum systems and wastewater management methane emissions contributed about 5 percent and 3 percent, respectively, to the inventory of CH4 emissions.

Methane emissions from wastewater management are produced when municipal and industrial wastewater and their residual solid by-product (sludge) are handled under or subject to anaerobic conditions.

Other sources of methane – rice cultivation, abandoned underground coal mines, iron and steel production, field burning of agricultural residues, and silicon carbide production – also contribute to atmospheric methane emissions.

The natural gas system’s contribution of methane to the U.S. total greenhouse gas inventory is therefore estimated at about 2.4 percent; that is, methane’s 10 percent share of total emissions multiplied by the natural gas system’s roughly 24 percent share of methane emissions.
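
The 2.4 percent figure follows directly from the two shares quoted above; a quick sketch of the arithmetic, using the same rounded shares and nothing more precise, is below.

```python
# Natural gas systems' share of the total U.S. GHG inventory, using the
# rounded shares quoted in the text above (illustrative arithmetic only).

ch4_share_of_inventory = 0.10   # CH4 as a fraction of total U.S. GHG emissions
natgas_share_of_ch4 = 0.24      # natural gas systems' fraction of CH4 emissions

natgas_share_of_inventory = ch4_share_of_inventory * natgas_share_of_ch4
print(f"Natural gas systems ~ {natgas_share_of_inventory:.1%} of the total inventory")
# -> Natural gas systems ~ 2.4% of the total inventory
```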

EPA Reporting Rules for the O&G Industry
On December 30, 2010, the EPA promulgated a regulation requiring monitoring and reporting of greenhouse gas emissions from petroleum and natural gas systems. The GHG reporting rules are contained in Title 40 CFR Part 98 – Mandatory Greenhouse Gas Reporting.

The action does not require control of methane emissions.

GHG reporting is for actual emissions, unlike air permits, which are based on Potential To Emit (PTE). The rule requires a facility that has actual emissions of 25,000 metric tons or more of CO2e per year to submit an annual GHG report in electronic format to the EPA.
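
As a rough illustration of how the 25,000-metric-ton CO2e threshold works, the sketch below applies it to two hypothetical facilities; the emission figures and the GWP of 21 are assumptions for illustration, not values taken from the rule itself.

```python
# Sketch of the Part 98 reporting test described above: a facility reports
# if its actual annual emissions reach 25,000 metric tons CO2e. The facility
# numbers are hypothetical; a GWP of 21 for CH4 is assumed for illustration.

REPORTING_THRESHOLD_TCO2E = 25_000  # metric tons CO2e per year
GWP_CH4 = 21


def must_report(ch4_tonnes: float, co2_tonnes: float) -> bool:
    """True if combined CO2 + CH4 emissions (as CO2e) meet the threshold."""
    co2e = co2_tonnes + ch4_tonnes * GWP_CH4
    return co2e >= REPORTING_THRESHOLD_TCO2E


print(must_report(ch4_tonnes=900, co2_tonnes=8_000))  # 8,000 + 18,900 = 26,900 -> True
print(must_report(ch4_tonnes=300, co2_tonnes=5_000))  # 5,000 +  6,300 = 11,300 -> False
```
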
Methane Emissions: Natural Gas System
Identifying the sectors within the natural gas system contributing to methane emissions:

Natural Gas Supply Chain Fugitive Leakage Diagram

Methane leak rates specified in the paper, “Methane Leaks from North American Natural Gas Systems.” Credit: Stanford University/Science

This figure from the U.S. Environmental Protection Agency shows the leakage estimates by major industry segments along the supply chain – from preproduction activities of drilling and hydraulic fracturing, followed by production, processing, transmission, distribution, and end use at homes, buildings, factories, and modes of transportation.

The EPA estimates the leakage rate – the amount of gas lost per unit of production – throughout the natural gas system at about 1.5%. Of those losses, drilling and fracturing contributed 13% of total emissions, production another 27%, and processing 13%, with the remaining 47% coming from the transportation and distribution sectors.
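
Applying those percentages to a round-number production volume gives a feel for the absolute quantities involved; the 25,000 Bcf/yr figure below is an assumed illustration, not a number from the EPA estimate.

```python
# Allocate an overall ~1.5% loss rate across the supply-chain segments using
# the percentage split quoted above. The production volume is an assumed
# round number for illustration only.

TOTAL_LEAK_RATE = 0.015
SEGMENT_SHARES = {
    "drilling & fracturing": 0.13,
    "production": 0.27,
    "processing": 0.13,
    "transmission & distribution": 0.47,
}

annual_production_bcf = 25_000  # assumed U.S. gross production, Bcf/yr
total_leaked_bcf = annual_production_bcf * TOTAL_LEAK_RATE

for segment, share in SEGMENT_SHARES.items():
    print(f"{segment:30s} ~{total_leaked_bcf * share:5.0f} Bcf/yr")
print(f"{'total':30s} ~{total_leaked_bcf:5.0f} Bcf/yr")
```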

Field production activities account for 31 percent of CH4 emissions from natural gas systems. Emissions at natural gas well pads come from leaks, pumps, unloading liquids from wells, pneumatic devices, compressors, condensate tanks, and dehydrators. Emissions also come from leaks, pneumatic devices, and compressors at gathering stations, which increase the pressure of the gas in the gathering pipelines.

Methane emissions at oil well pads are lower than at gas wells and again come from leaks, pneumatic devices, and storage tanks.

However, flaring of associated gas at oil well pads is an additional source of CH4 emissions and accounts for the majority of the non-combustion CO2 emissions.

Processing plants account for 15 percent of CH4 emissions from natural gas systems. Fugitive CH4 emissions from compressor venting, including leaky compressor seals, are the primary emission source from this stage. The majority of non-combustion CO2 emissions come from acid gas removal units, which are designed to remove CO2 from natural gas.

CH4 emissions from the transmission and storage sector account for approximately 44 percent of emissions from natural gas systems, while CO2 emissions from transmission and storage account for less than 1 percent of the non-combustion CO2 emissions from natural gas systems. Compressor station facilities, which contain large reciprocating and turbine compressors, are used to move the gas throughout the United States transmission system. Fugitive CH4 emissions from these compressor stations and from metering and regulating stations account for the majority of the emissions from this stage.  Pneumatic devices and engine un-combusted exhaust are also sources of CH4 emissions from transmission facilities.

Natural gas is also injected and stored in underground formations, or liquefied and stored in above ground tanks, during periods of low demand (e.g., summer), and withdrawn, processed, and distributed during periods of high demand (e.g., winter).  Compressors and dehydrators are the primary contributors to emissions from these storage facilities.

Distribution system emissions, which account for approximately 20 percent of CH4 emissions from natural gas systems and less than 1 percent of non-combustion CO2 emissions, result mainly from fugitive emissions from city gate stations and pipelines, where the gas is measured and decompressed before it is put into final sales lines to the consumers.

An increased use of plastic piping, which has lower emissions than other pipe materials, has reduced emissions from this stage. Distribution system CH4 emissions in 2011 were 16 percent lower than 1990 levels.

While not shown, customer meter sets were found to contribute approximately 5 percent of annual emissions from equipment leaks. Emissions from outdoor residential customer meter sets account for 96 percent of the annual fugitive emissions from customer meters, whereas commercial and industrial meter sets account for only 4 percent.

Point Sources of Emissions
The Gas Research Institute (GRI) and the U.S. Environmental Protection Agency sponsored a program to quantify methane emissions from the gas industry, starting at the wellhead and ending immediately downstream of the customer’s meter. The major contributors to emissions from equipment leaks are components associated with compressors, which have unique design and operating characteristics and are subject to vibrational wear. These components represent mechanical joints, seals, and rotating surfaces, which in time tend to wear and develop leaks. The largest emission point source is the compressor blowdown open-ended line (BD OEL), which allows the compressor to be depressurized for maintenance or when idle.

This bar chart aggregates emissions by component from data presented in the preceding table, making it easier to identify the major point sources of emissions. It is rather obvious that compressor blowdown operations far exceed all other component emissions.

Average Facility Emissions in Transmission Sector


CH4 Fugitive Emission Reduction Opportunities
A groundbreaking analysis commissioned by the Environmental Defense Fund and conducted by ICF International (ICF) shows that the U.S. oil and gas industry can significantly and cost-effectively reduce emissions of methane using currently available technologies and operating practices.

The report concluded:
• There are real cost-effective solutions available today that can put natural gas on a safer path for communities and for the climate.
• Methane emissions from U.S. oil and gas are projected to increase 4.5% by 2018 as emissions from industry growth – particularly in oil production – outpace reductions from regulations already on the books.
• Industry could cut methane emissions by 40% below projected 2018 levels at an average annual cost of less than one cent per thousand cubic feet of produced natural gas by adopting available emissions-control technologies and operating practices. This would require a capital investment of $2.2 billion, which Oil & Gas Journal data shows to be less than 1% of annual industry capital expenditure (a rough order-of-magnitude check appears after this list).
• If the full economic value of recovered natural gas is taken into account, the 40% reduction is achievable while saving the U.S. economy and consumers over $100M per year.
• The most cost-effective methane reduction opportunities would create over $164M net savings for operators.
• Almost 90% of projected 2018 emissions will come from oil production and existing natural gas infrastructure.
• A number of solutions, particularly in the upstream portion of the oil and gas value chain, will have environmental co-benefits at no extra cost, by reducing emissions that can harm human health, such as volatile organic compounds and hazardous air pollutants.
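
As flagged in the list above, here is a simple order-of-magnitude check on the cost-per-Mcf claim; the amortization period and production volume are assumptions chosen for illustration, and the ICF study’s own methodology is more detailed.

```python
# Order-of-magnitude check on the "less than one cent per Mcf" figure.
# Straight-line annualization of the $2.2B capital cost and a ~25 Tcf/yr
# production volume are assumptions made for illustration only.

capital_investment_usd = 2.2e9   # one-time capital cost cited in the report
amortization_years = 10          # assumed recovery period
annual_production_mcf = 25e9     # assumed ~25 trillion cubic feet per year

annualized_cost = capital_investment_usd / amortization_years
cost_per_mcf = annualized_cost / annual_production_mcf
print(f"~{cost_per_mcf * 100:.1f} cents per Mcf of produced gas")
# -> on the order of a penny per thousand cubic feet
```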

Methane Mitigation Technologies
Several technologies are currently available that can economically reduce fugitive and vented methane emissions. The nine most promising technologies include:
• Implementing Leak Detection and Repair Programs
• Rerouting “Blowdown” Open-Ended Lines
• Replacing Wet Seals with Dry Seals in Centrifugal Compressors
• Installing Vapor Recovery Units on Storage Tanks
• Installing Plunger Lift Systems in Gas Wells
• Installing Wet Seal Degassing Recovery Systems on Centrifugal Compressors
• Converting High‐Bleed Pneumatic Devices to Low‐Bleed
• Replacing Reciprocating Compressor Rod Packing
• Converting Natural Gas‐Driven Chemical Pumps (Kimray Pumps)

EPA STAR Program
EPA’s Natural Gas STAR Program has been the most successful program so far in mitigating emissions in the oil and natural gas system. STAR is a flexible, voluntary partnership that encourages oil and natural gas companies—both domestically and abroad—to adopt cost-effective technologies and practices that improve operational efficiency and reduce emissions of methane while benefiting the environment, industry, and the government.

Currently, Gas STAR includes 109 domestic oil and gas partner companies from all sectors, representing about 50% of the U.S. natural gas industry, as well as 18 international partners.

From the program’s start in 1993 through 2012, the Natural Gas STAR Program reported over 1 trillion cubic feet of methane emissions reductions, achieved by implementing approximately 150 cost-effective technologies and practices.

This figure shows domestic Natural Gas STAR methane emissions reductions from the program’s start in 1993 to 2012. Each year since 1993, Natural Gas STAR partners have reported on their emission reduction activities, creating a permanent record of their voluntary efforts.

Natural Gas STAR Program Emission Reductions, Annual and Cumulative

In 2012, domestic Natural Gas STAR partners reported over 66 billion cubic feet (Bcf) in methane emission reductions by implementing nearly 50 technologies and practices.

These methane emissions reductions had cross-cutting benefits on domestic energy supply, industrial efficiency, revenue generation, and greenhouse gas emissions reductions.

Alone, the 2012 emission reductions are equivalent to the following (a rough check of these figures appears after the list):
• Additional revenue of more than $264 million in natural gas sales (assuming an average natural gas price of $4.00 per thousand cubic feet).
• The avoidance of 26.7 million tonnes CO2 equivalent.
• The carbon sequestered annually by 5.7 million acres of pine or fir forests.
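
A rough reproduction of the first two equivalences is sketched below; the methane density (~0.0192 kg per standard cubic foot) and the GWP of 21 are assumptions consistent with the figures quoted in the text, not official EPA factors.

```python
# Back-of-the-envelope check of the 2012 equivalences: 66 Bcf of avoided
# methane valued at $4.00/Mcf, and the same volume expressed as CO2e.
# The methane density and GWP below are approximations for illustration.

reductions_bcf = 66
gas_price_per_mcf = 4.00            # $/Mcf, as stated above
ch4_density_kg_per_scf = 0.0192     # approx. density at standard conditions
gwp_ch4 = 21

revenue_usd = reductions_bcf * 1e9 / 1e3 * gas_price_per_mcf
ch4_tonnes = reductions_bcf * 1e9 * ch4_density_kg_per_scf / 1e3
co2e_tonnes = ch4_tonnes * gwp_ch4

print(f"Gas sales value : ~${revenue_usd / 1e6:.0f} million")        # ~$264 million
print(f"CO2e avoided    : ~{co2e_tonnes / 1e6:.1f} million tonnes")  # ~26.6 million t
```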

Adding to the success reported under the domestic program, progress was also made in reducing global methane emissions through Natural Gas STAR International. International partners reported 7.6 Bcf in methane emissions reductions in 2012, for a total of 98 Bcf since the inception of the Natural Gas STAR International Program in 2006.

This figure shows the 2012 methane emission reductions broken down by sector – Production (which includes Gathering and Processing), Transmission, and Distribution.

2012 Natural Gas Emission Reductions by Sector

As in past years, the oil and gas production sector reported the largest reductions, accounting for 82 percent, or 54 Bcf, of the total. The transmission sector followed at 15 percent, or 10 Bcf, of the total methane emissions reductions.

Methane emissions reductions in 2012 occurred through the implementation of nearly 50 technologies and practices.

Examples of technologies and practices used to reduce methane emissions included:
• Perform reduced emissions completions
• Use Directed Inspection and Maintenance (DI&M)
• Install flash tank separators on glycol dehydrators
• Use pipeline pumpdown techniques to lower pressure
• Implement third-party damage prevention programs

These proven technologies and practices reduce methane emissions that would normally escape to the air from wells, storage tanks, and other equipment. These reductions provide a significant environmental benefit by cutting methane, a potent greenhouse gas (GHG), as well as reducing volatile organic compound (VOC) emissions, a precursor to ground-level ozone pollution.

In summary, EPA could cut the sector’s methane pollution in half in just a few years by issuing nationwide methane standards that require common-sense, low-cost pollution controls for the sector’s top emitting sources:

The methane abatement potentials are conservative estimates based on government inventories. They don’t account for research indicating that actual emissions could be twice the inventory estimates, or higher. The problem—and the upsides of controlling it—are likely much greater.

Regular leak detection and repair programs can reduce methane pollution by an estimated 1,700,000 to 1,800,000 metric tons per year.

Cleaning up older equipment—compressors and gas-driven pneumatic equipment—with proven technologies and practices can reduce methane pollution by an estimated 1,200,000 to 1,350,000 metric tons per annum.
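
Taken together, those two abatement estimates can be summed and expressed in CO2-equivalent terms; the quick sketch below uses the GWP of 21 cited earlier in this post.

```python
# Sum the two abatement estimates above and express them as CO2e,
# using the GWP of 21 cited earlier in this post.

gwp_ch4 = 21

ldar_mt = (1.70, 1.80)        # million metric tons CH4/yr, leak detection & repair
equipment_mt = (1.20, 1.35)   # million metric tons CH4/yr, older-equipment upgrades

total_low = ldar_mt[0] + equipment_mt[0]
total_high = ldar_mt[1] + equipment_mt[1]

print(f"Combined CH4 abatement: {total_low:.2f} to {total_high:.2f} million t/yr")
print(f"In CO2e terms         : {total_low * gwp_ch4:.0f} to {total_high * gwp_ch4:.0f} million t/yr")
```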

The cost of the recommended standards would be low—less than one percent of the industry’s sales revenue.

The impacts of various drivers on reducing methane emissions through voluntary measures by oil and gas producers are already positive. For example, regulatory and social mandates to reduce emissions are driving oil and gas operators to modify traditional operating practices by reducing natural gas venting during oil and gas production.

Technology developments are allowing operators to implement changes and to capture and sell more natural gas.

Greenhouse Gas Emissions and Natural Gas: The EPA and the Forgotten Source – The Oil and Gas Industry!

April 20, 2015

Please join me on Thursday, June 4, 2015 at 1:00 EDT, for my live webinar “Greenhouse Gas Emissions and Natural Gas: The EPA and the Forgotten Source – The Oil and Gas Industry!”

The webinar will look at reducing methane emissions – fugitive, vented, and flared – from the natural gas system, not only to be good environmental stewards but also to reap substantial profits.

In his January 2015 State of the Union address, President Obama emphasized two goals: the critical need to limit greenhouse gas pollution, and support for domestic natural gas and oil production, as well as renewable energy sources. His administration is seeking a 40 percent to 45 percent reduction in methane leaks and emissions of other volatile organic compounds from oil and gas wells and supporting infrastructure.

In support of both goals, the Environmental Protection Agency (EPA) announced it intends to regulate methane emissions from the oil and gas sector directly, rather than relying on voluntary programs or regulating associated pollutants. The proposal would be the first-ever direct regulation of methane, part of an Obama administration strategy expected to curb methane emissions by as much as 45 percent by 2025.

My intent is to make this Webinar as informative as possible and make you aware that:

  • there is a problem with methane emissions from oil and gas production, gathering and processing, transmission, and distribution;
  • the problem is solvable and beneficial to stakeholders,
  • and that government can work with the industry to achieve mutually beneficial results without excessive rules and regulations,
  • while balancing the seemingly contradictory needs between people, planet and profits.

We will lay this foundation by:

  • Demystifying the natural gas system
  • Understanding the drivers to reduce U.S. Methane emissions
  • Unraveling the current state of the U.S. Greenhouse gas inventory
  • Reviewing the regulatory side of the equation
  • Identifying several barriers inhibiting quantification of fugitive emissions
  • Developing estimates of methane leakage from the natural gas system
  • Highlighting the CH4 fugitive emission reduction opportunities
  • Comparing reduction opportunities against marginal abatement costs
  • Outlining several methane mitigation technologies
  • Explaining the leak detection and repair (LDAR) process and mechanics
  • Conducting case studies and economic analysis of several primary improvement practices and technologies
  • Suggesting some enabling trends that are not advancements in technology and practices,
  • and finishing with a summary and some closing comments.

Discussions of Hydraulic Fracturing Should be Based In Reality?

February 25, 2015

Mr. Thomas R. Muth, Materials Science & Technology Division, Oak Ridge National Laboratory, sent me an article, “Discussions Of Nuclear Power Should Be Based In Reality,” by Theodore Rockwell, published by “The Scientist” on March 16, 1998. The piece is all about the importance of putting topics into the proper context with real-world meanings and the pitfalls of expanding a hypothetical conjecture into a real-world problem.

The article applies as much to the hazards of hydraulic fracturing as it does to the dangers of nuclear power plants. To read the entire article, go to:

The great scientist-philosopher Sir Arthur Eddington wrote that his words about “the soulless dance of bloodless electrons” might be truth, but they were not reality. He urged us to get away from theoretical speculations periodically and watch a sunset. Speculation is our business, but when people ask us about a technical matter, they deserve an answer that has real-world meaning, not a hypothetical argument.

For example, one day consumer activist Ralph Nader was debating radiation pioneer Ralph Lapp. Nader stated that a pound of plutonium could kill every human being on Earth. One could picture a one-pint jar of the stuff spilling on the ground and its deadly vapors spreading until all life was obliterated. That’s what Nader’s statement means in the common-sense real world. But Lapp put the statement in its proper context by replying: “So could a pound of fresh air, Ralph.” Now how can that be? We’ve been repeatedly told that plutonium is the deadliest substance known. And we know that fresh air is literally the breath of life. What’s going on here? Nader’s statement was not actually a lie; he was just trying to make us think that a hypothetical conjecture was a real-world problem. He’s saying that the lethal dose of plutonium is a five-billionth of a pound. It’s really several thousand times larger, but even if Nader were correct, the only way you could actually kill the world’s 5 billion people with just one pound would be to line them up and have a trained physician inject into each person just the toxic amount of plutonium – no more, or there wouldn’t be enough to go around. It would have to be in a fine aerosol mist, or it wouldn’t be lethal, and it would have to go directly into the lung. Then we would have to wait several decades, protecting the individual from other life-threatening influences such as cars, smoking, and malnutrition, until he or she died of lung cancer, because plutonium poses no other health threat.

Nader’s statement is truth, of sorts, but it is not reality. In reality, atomic bomb tests have dispersed about six tons of fine plutonium mist into the air, enough to give each person in the world 1,000 cancers, and we’ve had some laboratory accidents and spills that contaminated people. But not a single case of plutonium-caused cancer has been found, despite diligent searching. (Incidentally, plutonium is not the deadliest substance known; there are pesticides we throw onto food crops by the ton that are more toxic, spoonful for spoonful.)

And what about Lapp’s statement? It is true in precisely the same way as Nader’s. If a tiny bubble of fresh air is injected in just the right way into the bloodstream, a fatal embolism will develop. The only difference from the plutonium case is that you wouldn’t have to wait decades for cancer to develop. We do not think of fresh air as deadly, lethal, or dangerous, and rightly so, although people have been killed by air bubbles in their blood. How dangerous is plutonium in the real world? The answer is: Not a single death has resulted from plutonium poisoning, although we’ve been handling it in tonnage lots for a couple of generations. A sheet of paper, or even a few feet of air, provides enough shielding from its radiation. That’s the difference between the world of the imagination and the real world we live in.

Since most nonscientists don’t flit so easily from the hypothetical world to the physical world, we should be clear when we do. When we talk about casualties, we should distinguish between real and hypothetical deaths. For example:
1. Persons who die of food poisoning are known by name and can be counted. They are real.
2. Persons who die from particulate air pollution are largely unknown individually, but their numbers can be estimated approximately by methods that are subject to peer evaluation. These victims are nameless and their number controversial, but they are probably real.
3. Deaths “predicted” from exposure to radiation levels less than natural radiation backgrounds are wholly hypothetical, since the premise on which such calculations are based is an administrative convenience, not a scientific model. The premise is that individually harmless doses of radioactivity in a population can be added up to “predict” illness and even deaths in that population-a notion that affronts both science and common sense.

These various kinds of victims should not be compared as if they were the same. We should not justify America’s 9,000 annual food-poisoning deaths and tens of thousands of air-particle deaths by claiming we have avoided hypothetical deaths that might result from irradiating the food or replacing coal-burning plants with nuclear. Scientists have expressed their concerns about global warming and particulate emission predictions but have been surprisingly reluctant to speak out on radiation questions. Why? We are told that we must choose between wrecking the planet by continuing to burn fossil fuels at current rates or wrecking the economy by drastically reducing our energy usage. We don’t even discuss the option of using nuclear power to produce as much energy as needed without creating pollution or economic disruption. Nuclear power has been reliably and safely generating 22 percent of the United States’ electricity for a full generation. But we ignore fission and talk about untried hopes such as fusion, solar power, and undefined “renewables.”

We decide not to build another nuclear power plant because “we haven’t solved the waste problem.” How many people do we save by not adding to the nuclear waste? None. No one has ever been hurt by nuclear waste in the U.S., and no one is ever likely to be. We should treat radioactive waste just as we do selenium, arsenic, cadmium, mercury, barium, and other toxic materials whose half-lives are infinite. With such toxins we have ample experience that simple, common-sense waste disposal practices are fully adequate.

Another notorious hypothetical scenario is the dreaded nuclear reactor meltdown and the subsequent China syndrome, in which the molten core melts into the Earth on its way to China. (We’re talking about the only kind of reactors built in the West and in the Pacific Rim. The Chernobyl reactor is a different story-not as bad as you’ve heard, but not relevant here.) To get radioactive clouds and evacuation plans and all the other aspects of a nuclear emergency, we had to dream up a situation that would get all of the water out of the reactor vessel fast; otherwise, the reactor will not melt. In the laboratory of the mind, that’s easy. We came up with the “guillotine break,” a magical, instantaneous shearing of the heavy-duty main coolant piping. But even that is not enough, because the water can’t escape rapidly unless the sheared pipe ends move out of the way of each other quickly so that the water can flash unimpeded into clear space. No problem-the mind can move the pipe ends instantaneously, even though the pipe walls are more than an inch thick and made of high-grade stainless steel.

Other scenarios spring up like mushrooms. To study how radioactive clouds disperse under the worst possible weather conditions, we imagine a hierarchy of fantastic scenarios. This requires us to put a network of radiation monitors around each nuclear plant. And we put more engineering hours into calculating the impact of severe earthquakes than we used to use for the whole plant design. And we set up elaborate security provisions. And every component and safety system is backed up with backup systems. And we put the whole thing inside a steel-reinforced, leak-tight containment structure. And we prepare emergency procedures involving local, regional, and national police and fire and emergency organizations, and we run periodic drills. And then we turn to the public and say: “How about that! Are we safe or what?” And the public says, “Gosh, they must really be scared of this stuff.” And who could blame them?

The public didn’t know we were just playing games-serious games, legitimate games, but hypothetical speculations, not reality. What does the real world say about nuclear safety? Quite a bit, actually. Experiments and theoretical studies have been made, and we had the real thing at Three Mile Island in 1979. Nearly half the core melted down, and tons of the molten stuff fell down onto the bottom of the pressure vessel. That is the start of the China syndrome scenario. But in fact the core penetrated only a small fraction of an inch into the thick vessel wall and stopped. Negligible radioactivity was released; the nearest residents got about as much radiation from the accident overall as they get each day from the natural radiation background (having nothing to do with the nuclear plant). No one was hurt, not even the operators. When I pressed a Nuclear Regulatory Commission official as to why this was not more nearly the model for a major reactor accident, rather than various theoretical speculations, he looked shocked and said: “If I really thought that, I’d have to ask what I’m doing here!” I assured him he should ask exactly that, as we all should.

So, after 40 years’ experience and running more than 100 U.S. nuclear power plants (plus twice that many in the Navy), plus hundreds more in other countries, the Three Mile Island accident is the worst the real world can offer: nobody hurt, no environmental damage. Yet we proceed as if the speculations were real. The game is now costing hundreds of billions of dollars: making multimillion-dollar studies; “decontaminating” land that is already harmless; designing shipping casks with yet another layer of protective shield although the radioactive cargo they contain poses less of a public hazard than the diesel fuel in the truck that carries it. And spending $13 billion to dig a hole in Yucca Mountain in Nevada to hold some shielded casks of spent fuel and nuclear wastes.

On June 3, 1997, the Department of Energy issued a report “after six years of study and analysis,” predicting that 23 people will be irradiated to death as a result of shipping shielded casks of radioactive waste from the weapons program (not civilian waste). Let me tell you how this works. As a truck with a shielded cask drives by, a government official says to a bystander: “Congratulations, sir. You are the millionth bystander.” The puzzled fellow asks. “What do I get?”

“You get to die,” replies the official. “This cask has been emitting radiation at one-millionth the lethal level. We have now passed a million bystanders and no one has died, so it’s up to you.”

“But I got only one-millionth of a lethal dose, right?” he asks. “And that can’t hurt me, right?”

“Correct, sir. But we have delivered a lethal dose overall, to the whole population of bystanders. I don’t expect you to understand it. Just be assured that these calculations have been peer-reviewed by scientists. You can count on them.”

“Tell me this is just a game,” the poor chap moans.

Do you doubt it?

PS: I lived with my wife and newborn son near the Three Mile Island nuclear disaster. Though we evacuated the area that Friday night, when the non-condensable hydrogen bubble in the reactor vessel was about to break through, I returned Monday morning. Measurements of the radiation level taken at the RCA Picture Tube Engineering facility in Lancaster, PA, with calibrated Geiger counters showed a background radiation level lower than that measured after China detonated their first nuclear weapon. Also, no one in my family glows in the dark!

Are Gun Control Advocates Going About It All Wrong?

February 14, 2015

Most interpret the Second Amendment based on what it says, not on the grounds of what it does not say.

The Second Amendment of the United States Constitution, ratified as part of the Bill of Rights on December 15, 1791, reads: “A well regulated Militia, being necessary to the security of a free State, the right of the people to keep and bear Arms, shall not be infringed.”

According to Chuck Dougherty, The Minutemen, the National Guard and the Private Militia Movement: Will the Real Militia Please Stand Up? , 28 John Marshall Law Review 959, 962-970 (Summer 1995), “The colonists (arriving in America during the seventeenth century) drew from their knowledge of the militia system in England to develop their own military forces. The resulting colonial militia laws required every able-bodied male citizen to participate and to provide his own arms.”

Eighteenth-century thinking viewed the militia as comprising all able-bodied civilians eligible by law for military service, supplementing a regular army in an emergency. The individual right to bear arms was legalized, defined, and reinforced by the Militia Acts of 1792 and 1795.

Congress passed the Militia Act of 1795 before the Militia Act of 1792 expired; the earlier act was to remain in force for the term of two years and to the end of the next session of Congress. The Act of 1795 mirrored the provisions of the 1792 Act and provided federal standards for the organization of the militia. This Militia Act was in turn amended by the Militia Act of 1862, which allowed African-Americans to serve in the militias of the United States. It was superseded by the Militia Act of 1903, which established the United States National Guard as the chief body of organized military reserves in the United States.

The federal standards enacted by the Act of 1795 stipulated: “Each and every free able-bodied white male citizen of the respective States, resident therein, who is or shall be of age of eighteen years, and under the age of forty-five years (except as is herein after excepted) shall severally and respectively be enrolled in the militia…..”

Scholarly commentaries and Supreme Court decisions on the meaning of “well regulated militia,” “the right of the People,” and “keep and bear arms” began soon after approval of the Bill of Rights, with constitutional theorist St. George Tucker’s 1803 edition of Sir William Blackstone‘s Commentaries on the Laws of England and the 1820 Supreme Court case of Houston v. Moore.

The founding fathers’ interpretation of the right to bear arms granted to U.S. male citizens between the ages of eighteen and forty-five therefore included:

  • Felons,
  • Prison Inmates,
  • Sexual Predators,
  • Mentally Insane, and
  • Terrorists.

Women and those of each gender under 18 and over 45 were expressly prohibited from bearing arms.

In closing, gun control advocates should not call for limitations on the right to bear arms; rather, they should protest for the inclusion of those who present a danger to society, on the grounds that the founding fathers did not explicitly exclude these groups from the Second Amendment. Women, it looks like the Amendment is not in your favor when it comes to owning guns.

Postscript: The author, at 65, is an avid believer in strict gun control measures, holds a concealed handgun license, and has been trained in the art of handling and discharging weapons by the NRA since 12 years of age.

Life Is Beautiful – Randy Pausch’s Inspirational “Last Lecture” at Carnegie Mellon University

January 2, 2015

Originally posted on BarryOnEnergy:

Today’s discussion is not on energy, nor is it controversial, political, technical or global. At times it becomes necessary to step back from the concerns of the modern world and take a more personal and introspective view on life. Winston Churchill once said:

“We make a living by what we get, but we make a life by what we give.”


Such is the case with Randy Pausch’s inspirational “Last Lecture,” which captivated the world and inspired millions. On one hand, Randy’s reason for giving the “Last Lecture” may seem tragic at best, but on the other hand, he left us and the world with a heartwarming and upbeat lecture that serves as a roadmap on how to live life to its fullest and what really matters. The lecture should be listened to by all. After hearing Randy’s speech, it can be said that he “made a life and gave…


Season’s Greetings

December 17, 2014

Dear Friends and Colleagues,

Wishing you and your family a joyful Holiday Season and an exciting New Year.

With warmest personal regards,

Barry Stevens, Ph.D.
TBD America, Inc.

Who Says Antarctica is Gaining Ice – A Review?

December 5, 2014

Like most aspects of climate change, a wide diversity of opinion exists about whether the Arctic and Antarctica are gaining or losing ice – a collision between science and belief. In an almost futile attempt to close this chasm, this review presents both sides of the equation. The answer to this dilemma lies in the parable of the blind men and an elephant: “that one’s subjective experience can be true, but that such experience is inherently limited by its failure to account for other truths or a totality of truth.”

To understand this vicious circle, we need to look no further than the top of the globe – the Arctic.

In September 2013, it was reported that “Arctic sea ice up 60 percent in 2013.” The article states, “The surge in Arctic ice is a dramatic change from last year’s record-setting lows, which fueled dire predictions of an imminent ice-free summer.” A few days later, Nature World News headlined “Arctic Sea Ice Extent Up 50 Percent from Last Year, but Still 6th Lowest on Record.” The article states, “We had cool conditions, cooler than the long-term average, and yet it is still going to be the sixth-lowest ice minimum on record. However, this year’s ice gain is not of iceberg proportion. Much of the added ice is thin and slushy, which is in line with a general trend towards thinning ice in the Arctic.”

Then in November 2014, Nature World News featured a piece, “Arctic Sea Ice May Completely Disappear in Our Lifetime.” The researchers featured in this article claim, “According to the study, an ice-free period hasn’t been seen in the Arctic Ocean for 2.6 million years, a time when the Earth’s climate was warming up. But now the climate is changing again and global temperatures are increasing, impacting Arctic ice once more. By the end of the present century, researchers say, the Arctic Ocean may very well be completely free of sea ice, especially in summer.” With all these seemingly contradictory statements, it is not too surprising that the question of ice growth or melt is fraught with much confusion.

Moving south of the equator, Antarctica is embroiled with a similar quandary. Here, stronger feelings exist that ice is expanding rather than melting in much of Antarctica.

In “Why is Antarctic sea ice at record levels despite global warming?” The Guardian (October 2014) pointed out, “While Arctic sea ice continues to decline, Antarctic levels are confounding the world’s most trusted climate models with record highs for the third year running. The US National Snow and Ice Data Centre records show that Antarctica’s sea ice in 2014 was 1.54m sq. km above the 1981-2010 average. The past three winters have all produced record levels of ice (Figure 1).”

Figure 1: Average Monthly Antarctica Sea Ice Extent – September 1979 to 2014

Then, a few months later, NASA released a review, “West Antarctic Melt Rate Has Tripled” (NASA-UC Irvine, December 2014). According to scientists at the University of California, Irvine, and NASA, “A comprehensive, 21-year analysis of the fastest-melting region of Antarctica has found that the melt rate of glaciers there has tripled during the last decade. The glaciers in the Amundsen Sea Embayment in West Antarctica are hemorrhaging ice faster than any other part of Antarctica and are the most significant Antarctic contributors to sea level rise.”

Finally, to bridge the gap, Skeptical Science explains, “Skeptic arguments that Antarctica is gaining ice frequently hinge on an error of omission, namely ignoring the difference between land ice and sea ice.”

“In glaciology and particularly with respect to Antarctic ice, not all things are created equal. Let us consider the following differences. Antarctic land ice is the ice, which has accumulated over thousands of years on the Antarctica landmass itself through snowfall. This land ice therefore is actually stored ocean water that once fell as precipitation. Sea ice in Antarctica is quite different as it is ice which forms in salt water primarily during the winter months. When land ice melts and flows into the oceans global sea levels rise on average; when sea ice melts sea levels do not change measurably.”

“In Antarctica, sea ice grows quite extensively during winter but nearly completely melts away during the summer (Figure 2). That is where the important difference between Antarctic and Arctic sea ice exists, as much of the Arctic’s sea ice lasts all year round. During the winter months it increases before decreasing during the summer months, but an ice cover does in fact remain in the North, which includes quite a bit of ice from previous years (Figure 2). Essentially, Arctic sea ice is more important for the earth’s energy balance because when it increasingly melts, more sunlight is absorbed by the oceans, whereas Antarctic sea ice normally melts each summer, leaving the earth’s energy balance largely unchanged.”

Figure 2: Coverage of sea ice in both the Arctic (Top) and Antarctica (Bottom) for both summer minimums and winter maximums


Source: National Snow and Ice Data Center

“One must also be careful how you interpret trends in Antarctic sea ice. Currently this ice is increasing overall and has been for years, but is this the smoking gun against climate change? Not quite. Antarctic sea ice is gaining for many different reasons, but the most accepted recent explanations are:”

“i) Ozone levels over Antarctica have dropped, causing stratospheric cooling and increasing winds, which lead to more areas of open water that can be frozen, and

ii) The Southern Ocean is freshening because of increased rain and snowfall as well as an increase in meltwater coming from the edges of Antarctica’s land ice.”

“Together, these change the composition of the different layers in the ocean there causing less mixing between warm and cold layers and thus less melted sea and coastal land ice.”

“All the sea ice talk aside, it is quite clear that really when it comes to Antarctic ice and sea levels, sea ice is not the most important thing to measure. In Antarctica, the largest and most important ice mass is the land ice of the West Antarctic and East Antarctic ice sheets.”

“There is variation between regions within Antarctica, with the West Antarctic Ice Sheet and the Antarctic Peninsula Ice Sheet losing ice mass, and with an increasing rate. The East Antarctic Ice Sheet is growing slightly over this period but not enough to offset the other losses. There are of course uncertainties in the estimation methods but independent data from multiple measurement techniques (explained here) all show the same thing, Antarctica is losing land ice as a whole, and these losses are accelerating quickly.”

In closing, it is doubtful that this review will change anyone’s opinion about climate change and its impact on the earth, especially those who deny that global warming is an ever-increasing problem influenced by human behavior. Like the blind men and the elephant, deniers, limited by their failure to account for other truths, can’t change the irrefutable fact that lies in the totality of truth.

Deniers need to get over the notion that there is a worldwide global warming conspiracy of scientists, universities, the United Nations, nations, government agencies, and corporations. The only conspiracy is among those who base their opinion on how their bunion feels.

