Thursday, December 29, 2011

Thermal Radiation Barriers


During the dog days of summer, when the temperature passes 100 degrees Fahrenheit here in Virginia, my split-system heat pump struggles to cool the master bedroom. I have both draperies and window films, have retrofitted insulation in the attic, sealed the ducts, and regularly service the heat pump and blower, but still the best my system can do on those hot days is 78 degrees Fahrenheit in the south-facing master bedroom, though other rooms are several degrees cooler. Before I consider equipment solutions and additional ducts, which will have to wait until the current system serves out its useful life, I have been looking at radiant barriers and interior radiation control coatings as a possible way to shave a couple of degrees off the maximum daily temperature.

Radiant barriers and radiation control coatings have low infrared emittance, typically below 0.25. Infrared emittance is measured on a scale of 0 to 1, with highly polished stainless steel at less than 0.1 and wood and sheetrock approaching 0.8-0.9. Emittance measures the ability of a warm or hot material to shed some of its heat in the form of infrared radiation. A material with an emittance of 1.0 emits about 3.4 watts per square meter for each degree Fahrenheit above ambient temperature. Radiant barriers are designed to work in your attic, preventing some of the heat from the roof from being transferred into the attic space. Oak Ridge National Laboratory (ORNL) found in field experiments that radiant barriers installed in the attic could reduce air conditioning bills in the hottest parts of the country. For homes that had air-conditioning ductwork in the attic and were located in the Deep South, radiant barriers were found to reduce utility bills by as much as $150 per year, using average residential electricity prices (for the late 1990s) and an average-size house with a single-peaked rectangular roof. For more moderate summers, like those in Atlanta and Baltimore, annual energy savings were about half those of their southern neighbors. In the northern climate zones, the savings drop further, from about $40 to $10 per year as you go from Chicago to Fairbanks, Alaska.
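The 3.4 watts per square meter figure can be checked with a linearized Stefan-Boltzmann calculation. Here is a small sketch; the 300 K ambient temperature (roughly 80 degrees Fahrenheit) is my assumption for a summer attic:

```python
# Linearized Stefan-Boltzmann: near room temperature, the extra radiant
# power a surface sheds per degree of warming is dq/dT = 4 * e * sigma * T^3.
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def radiant_wpm2_per_degF(emittance, ambient_K=300.0):
    """Extra watts per square meter emitted for each degree Fahrenheit
    a surface sits above ambient (300 K is roughly 80 degrees F)."""
    per_kelvin = 4.0 * emittance * SIGMA * ambient_K ** 3  # W/m^2 per kelvin
    return per_kelvin * 5.0 / 9.0  # one degree F is 5/9 of a kelvin

print(round(radiant_wpm2_per_degF(1.0), 1))   # 3.4 for a perfect emitter
print(round(radiant_wpm2_per_degF(0.25), 2))  # a barrier at the 0.25 limit
```

Run with an emittance of 1.0 this reproduces the 3.4 watts per square meter per degree Fahrenheit cited above; at the 0.25 emittance threshold for a radiant barrier, the surface sheds only a quarter as much.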

If there were no ducts or air handlers in the attic, the savings were found to be much less, and a radiant barrier may not be worthwhile on a cost-benefit basis, though ORNL states that a radiant barrier may still help to improve comfort and to reduce the peak air-conditioning load on occasion. In northern climates where winter heating is the largest cost, radiant barriers can potentially reduce indoor heat losses through the ceiling during winter nights, but they may also reduce beneficial daytime heat gains from solar heating of the roof. ORNL had no data measuring the heating benefits, but climate, orientation of the home, level of attic insulation, number of sunny winter days, and other factors determine whether the net winter effect of a radiant barrier is positive or negative. The measured-benefit field studies were performed with air conditioning. Notably, the field testing showed that radiant barriers produce less energy savings when used in combination with high levels of insulation, since the fraction of the cooling load that comes through the ceiling is larger when the amount of insulation is small.

ORNL’s field testing showed that a new application of a radiant barrier on the attic floor works better than applying the radiant barrier to the roof rafters. However, most of the field tests were done with clean radiant barriers, and laboratory measurements have shown that dust on the surface of aluminum foil increases its emittance and decreases its reflectivity. This means that dust or other particles on the exposed surface of a radiant barrier will reduce its effectiveness, so barriers installed in locations that collect dust or other surface contaminants will lose performance over time. Though initially better, the attic floor application accumulates dust and the radiant barrier loses its effectiveness. Predictive modeling results, based on the ORNL testing, indicate that a dusty attic floor application will lose about half of its effectiveness after about one to ten years. Applying the radiant barrier to the floor of the attic is also not effective when a large part of the attic is used for storage, since the radiant barrier surface must be exposed to the attic space to work; applying a radiant barrier with the reflective surface touching the insulation is not effective.

In addition, a radiant barrier installed on the attic floor directly on top of insulation can create condensation, moisture, and ultimately a mold problem. During cold weather, water vapor from the interior of a house moves into the attic through bathroom and kitchen vents and other openings. In most cases, this water vapor is not a problem because attic ventilation allows it to dissipate. But during cold weather, a radiant barrier on top of the insulation could cause water vapor to condense and even freeze on the barrier's underside. A radiant barrier used in the attic floor application must therefore allow water vapor to pass through it; some allow passage through holes or perforations, while others are naturally permeable.

Due to the above factors it is usual to install a radiant barrier on the underside of the attic roof, but that installation may cause other problems. The testing showed that radiant barriers can increase roof temperatures: roof-mounted radiant barriers may raise shingle temperatures by 2 to 10 degrees Fahrenheit, while radiant barriers on the attic floor cause smaller increases of 2 degrees or less. The effects of these increased temperatures on roof life, if any, are not known, but should be considered with asphalt shingles. Attic ventilation helps to cool your attic in the summer and to remove excess water vapor in winter, and should not be blocked by a radiant barrier. After installing a radiant barrier, always check that existing ridge vent systems are not blocked and there is free flow of air, check the soffit vents to ensure that they have not been covered with insulation or the barrier, and check gable vents to make sure that they have not been blocked.

The attic is a system of many components that work together, and radiant barriers are only a small element, possibly the least important. Radiant barriers reduce radiant energy transfer. Insulation on the attic floor reduces conductive and convective heat transfer. Duct insulation reduces conductive and convective heat transfer at the duct surface. Duct sealing reduces the energy losses caused by increased air exchange between the inside and outside of your home. Attic ventilation in the gables, ridges or soffits can reduce the amount of energy that enters the attic from the outside. Overall, the chart above, derived from the ORNL research, shows that the most energy savings come from having adequate insulation and sealed and insulated ducts in the attic, not from radiant barriers, and that radiant barriers are most effective with less insulation in the air-conditioned South. Nonetheless, that small savings might improve comfort on a very hot day. Finally, if you install a radiant barrier, make sure the product label indicates that emittance is less than 0.25 as measured by ASTM C1371 and that the product is designed to work in your attic.

Monday, December 26, 2011

EPA Mercury Air Standards and Electrical Power in the United States


On Wednesday, December 21, 2011, the U.S. EPA released the final regulation for controlling mercury and other toxic emissions from coal-fired power plants. The Mercury and Air Toxics Standards (MATS) regulate mercury, arsenic, acid gas, nickel, selenium, and cyanide. The standards will slash emissions of these pollutants primarily from coal-fired electrical generation plants. This should not be confused with the Cross-State Air Pollution Rule, which requires reductions of sulfur-dioxide and nitrogen-oxide emissions in 23 Eastern and Midwestern states beginning next year, as well as seasonal ozone reductions in 28 states. Combined, these two rules will have a significant impact on the future cost and availability of electrical power in the United States and should be part of a carefully considered and clearly communicated environmental and energy plan for the nation.

According to the EPA, it will cost $9.6 billion annually to comply with the MATS regulations, and industry analysts believe that 10% to 20% of U.S. coal-fired generating capacity will be shut down by 2016. According to the EPA, the two rules together are estimated to prevent up to 46,000 premature deaths, 540,000 asthma attacks among children, and 24,500 emergency room visits and hospital admissions: “The two programs are an investment in public health that will provide a total of up to $380 billion in return to American families in the form of longer, healthier lives and reduced health care costs.” The EPA did not give an estimated combined cost of the two rules; however, the Edison Electric Institute, an industry trade group, claims the combined new rules will cost utilities up to $129 billion and eliminate one-fifth of America's coal electrical generating capacity.

In 2010 coal was used to produce 45% of electricity while oil was used to generate less than 1%, so the MATS rule mostly affects coal plants. The nation's coal-fired power plants were built as the nation grew and industrialized in the first half of the 20th century, when coal was the most abundant and cheapest available fuel. With the existing power plants in place, coal is still much cheaper than natural gas for generating electricity, but the tightening of regulations by EPA under the Mercury and Air Toxics Standards and the Cross-State Air Pollution Rule (even with recent modifications) will decrease that financial advantage because coal burns dirtier than natural gas. In addition, the recent availability of shale gas has lowered the cost of natural gas and provided a potentially reliable supply.

These new regulations will require existing plants to meet emission standards that are at least as stringent as the top 12% best-performing coal facilities, and may force some plants to convert to natural gas fuel or to shut down entirely. The generating capacity will have to be replaced with new plants that burn cleaner fuels and produce less pollution, but the cost of power will increase. Several state utility commissioners say they fear the agency's recent rules will push up electricity prices or could even hurt electric-system reliability if too many power plants are shut down. The EPA counters that less than 1% of the national generating capacity will be lost.
According to the EPA, about 600 power plants are covered by these standards. They emit harmful pollutants including mercury, non-mercury metallic toxics, acid gases, and organic air toxics such as dioxin.

Our modern society requires power; that is not going to change. The cost of power is a key factor in the cost of production and the cost of living. In the U.S. in 2010, over 90% of electrical power was produced by steam turbines powered by coal, oil, gas, and biofuels; wind and water may be used to spin turbines as well. Coal produced 45% of electricity, nuclear power generated 20%, natural gas 24%, hydroelectric 6%, wind 1%, and oil, wood, biomass, geothermal, solar and other sources generated the rest. The Mercury and Air Toxics Standards and the Cross-State Air Pollution Rule will reshape the industry by reducing coal-fired plants, but some fuel will still need to spin the turbines; in all probability natural gas will be substituted for coal. There will be economic impacts from the reduced demand for coal in the United States, the cost to convert, replace and upgrade power plants, and the increased demand for natural gas.

Natural gas is the cleanest of the fossil fuels. Burning natural gas in place of coal emits fewer harmful pollutants, but methane, the principal component of natural gas, is itself a potent greenhouse gas, trapping heat almost 21 times more effectively than carbon dioxide. This past year researchers at Carnegie Mellon University compared greenhouse gas emissions from the Marcellus Shale region with emissions from coal used for electricity generation. The authors found that natural gas from the Marcellus Shale had life-cycle greenhouse gas emissions 20-50% lower than coal for production of electricity, depending upon plant efficiencies and natural gas emissions variability. Shale-sourced natural gas could provide a reliable source of natural gas for our nation in this century and might make the conversion of some power generation worthwhile. However, before we push a significant portion of our electrical generating capacity from coal to natural gas, we should ensure that we will have the natural gas supplies available at the time and location needed to maintain a reliable electrical grid.

Thursday, December 22, 2011

A Full Year of Solar Power- My Return on Investment This Year



How did I do with a full year of my solar photovoltaic panels? Purchasing and installing a 7.36 kW solar array consisting of 32 Sharp 230-watt solar panels, 32 Enphase micro-inverters, and mounts cost $57,040. For the engineering and permits I paid $1,500, for a grand total of $58,540 out of pocket. Now it gets complicated. The 7.36 kW are equivalent to 6.2 kW PTC. I reserved a 6 kW PTC Renewable Energy Rebate from Virginia, and on completion of installation, inspection by the county, and sign-off by my power company, NOVEC, I filled out all my paperwork, provided copies of permits, signed-off inspections, invoices, technical information, contractor information, and pictures of the installation and meter (before the 180-day deadline despite snow, rain and contractor problems), and promptly (within 4 weeks) received my renewable energy rebate of $12,000 from Virginia. This payment was not taxable income, but rather reduced the “cost basis” of the PV solar system for federal tax purposes. Thus, from the original installation cost of $58,540 I subtract the Virginia Renewable Energy Rebate of $12,000 to obtain my net cost of $46,540, to which I apply the 30% federal tax credit of $13,962. My total out-of-pocket cost for my solar system after the first year is $32,578. My energy production as tracked by Enphase, at 9.7 megawatt-hours for the year, was actually higher than the PV Watts (DOE model) estimate (there were several weeks during the spring when my internet connection was spotty and the data from the solar panels was not consistently received by Enphase, so my generation was probably a little higher). My savings on electricity are $1,200 per year; NOVEC, a cooperative, has very good residential rates. That is about a 4% return on my investment each year (unless NOVEC raises their rates). Without additional incentives my PV solar array would return about 4% a year.
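The dollar arithmetic above can be laid out step by step. This sketch uses only the figures from this post:

```python
# Reproducing the out-of-pocket math from the post.
system_cost   = 57_040  # panels, micro-inverters, and mounts
permits       = 1_500   # engineering and permits
gross_cost    = system_cost + permits       # 58,540 total out of pocket
va_rebate     = 12_000  # Virginia Renewable Energy Rebate (reduces cost basis)
basis         = gross_cost - va_rebate      # 46,540 federal cost basis
fed_credit    = round(basis * 0.30)         # 13,962 federal tax credit
net_cost      = basis - fed_credit          # 32,578 net cost after first year
annual_saving = 1_200   # avoided electricity cost at NOVEC rates

print(net_cost)                                  # 32578
print(round(100 * annual_saving / net_cost, 1))  # 3.7 (percent simple return)
```

The simple return works out to about 3.7%, consistent with the "about 4%" figure above.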

The cost and return on a solar power system is largely based on regulated incentives, and there are more. The final incentive is the Solar Renewable Energy Credit, or SREC. Each SREC is a credit for one megawatt-hour of electricity produced. SRECs have value only because some states have solar set-asides in their Renewable Portfolio Standards (RPS), which require that a portion of the energy produced by a utility come from renewable power. Utilities in those states buy SRECs from solar installation owners. It is a way for states to ensure that the upfront cost of solar power is recovered from utility companies (and ultimately from rate-paying consumers). Some states, like New Jersey and Maryland, require their utilities to buy SRECs only from residents of their states, creating a closed market where the price can be very high until supply responds. Other states, like Virginia, have no current solar RPS requirement and their RPS is voluntary. Still other states, like Pennsylvania, allow their utilities to buy their SRECs from any resident within the PJM regional transmission organization. The Pennsylvania SREC price has collapsed due to oversupply and a method of calculating the penalty fee, the Solar Alternative Compliance Payment (SACP), that is favorable to the utilities and ultimately the consumer.

There are estimated to be about 105 megawatts of solar capacity now in place in Pennsylvania, while the 2004 law requires utilities to buy only 44 megawatts' worth of solar renewable energy credits for the current year. The result: SREC prices have crashed within Pennsylvania. The solar industry says the market may remain oversupplied for several years unless the legislature steps in. I am fortunate that my SRECs are registered and were grandfathered into the (now closed) Washington, D.C. market when it accelerated its solar RPS. So, for the moment, I can still sell my SRECs at an attractive price. I expect that the Washington, D.C. market price for SRECs will increase in the short run, then fall as market supply over-responds to the regulatory demand and the falling SACP. For the next two to three years I expect favorable SREC prices, with the Washington, D.C. SACP set at $500 until 2016 and regulatory demand slightly more than or near balance with supply for the moment. The DOE loan to Project Amp remains a market supply risk. Remember, the DOE recently approved a $1.4 billion loan guarantee to support Project Amp, the installation of 752 MW of photovoltaic solar panels on 750 existing rooftops owned by Prologis. This represents more than 80% of the total amount of PV installed in the U.S. last year, when the renewable energy solar photovoltaic rebates were widely available. Depending on where these solar photovoltaic panels are installed and in what time frame, they could significantly impact the solar market and change the SREC markets in several states.

Overall, the return on investment for my solar panels will be 4% based on the power they generate and the current cost of electricity from NOVEC, which has not raised its rates in more than 5 years and recently returned some profits to customers as rebates. As long as they are available I will continue to obtain additional profits from SRECs, but those returns are not guaranteed for the long term. This year I sold 8 SRECs for a net of $1,458 after fees but before taxes, so my return on my solar panels was 8% for the year, and slightly more than half of that return is taxable income. Still, this was the best investment we had this year.
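Putting the electricity savings and the SREC sales together, again using only this post's numbers, gives the combined first-year return:

```python
# Combined first-year return: electricity savings plus SREC sales.
net_cost    = 32_578  # out-of-pocket cost after rebate and federal tax credit
electricity = 1_200   # annual electricity savings at NOVEC rates
srec_net    = 1_458   # net proceeds from selling 8 SRECs, after fees

total = electricity + srec_net
print(total)                             # 2658 dollars for the year
print(round(100 * total / net_cost, 1))  # 8.2 (percent), close to the 8% cited
```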

Monday, December 19, 2011

Pennsylvania: Local Regulations, Mineral Rights and Fracking in Your Backyard


The first house we owned was just outside Pittsburgh, PA. We read all the paperwork and discovered that the property title was not fee simple; rather, the “mineral rights were excepted and reserved.” The wooded ravine behind the house bore all the signs of a surface coal mining operation that had not been reclaimed, but merely left to weeds and fast-growing trees. Coal mining had changed from the manual room-and-pillar mining of my grandfather's day to continuous longwall mining and surface mining using draglines. I worried about the day when better mining techniques or a nation desperate for energy might come back for the coal that sat beneath my house. Not once did I think about coal gas or fracking, but homeowners in Pennsylvania had better be thinking about that now. Legislation before the General Assembly may bring fracking to your neighborhood.

In Pennsylvania, ownership of surface rights and ownership of mineral rights are often separated. In addition, mineral rights on the same tract may be separated from each other: oil, gas, coal, hard-rock minerals, etc. may all be owned by separate companies. The mineral rights were usually separated before land was partitioned, so an individual or corporation may own the rights under an entire neighborhood. Pennsylvania does not maintain ownership records of mineral properties in a central location, nor does it have property tax records for mineral rights, because owners do not pay property taxes on those rights. Rather, county governments maintain the old transfer records that contain this information. Surface deeds are recorded in the county's Recorder of Deeds office, but the mineral rights are not, unless the mineral rights have recently been sold and the sale taxed.

All surface and mineral owners have property rights under the law. Pennsylvania recognizes both the mineral owner's right to recover the mineral and the landowner's right to protection from unreasonable encroachment or damage. Some towns have attempted to control hydraulic fracturing and shale gas processing through zoning. Now, Pennsylvania is considering legislation that would effectively remove oil and gas drilling and related gas processing activities from nearly all local land use regulation, including regulation under the Municipalities Planning Code. The legislation, Pennsylvania House Bill 1950, passed the House of the General Assembly on Nov. 17, 2011 by a vote of 107-76, was amended in the Senate, and was sent back to the House. This version is not expected to pass in the next two days; instead, it is expected that a committee will be appointed to form a compromise bill.

This bill would amend Title 58 (Oil and Gas) of the Pennsylvania Consolidated Statutes, consolidating the Oil and Gas Act; modifying the definitions, well permitting process, well location restrictions (including increasing horizontal setbacks from water supply wells), reporting requirements, bonding, enforcement orders, penalties and civil penalties; restricting local ordinances relating to oil and gas operations; and taxing the gas. However, the most significant element would in effect take away from the towns the ability to use zoning to exclude shale gas production from residential neighborhoods. According to Richard A. Ward, Township Manager of Robinson Township, PA, this bill would turn the entire state of Pennsylvania into one large industrial zone: no zoning could exclude fracking wells and shale gas processing.

While it would undoubtedly be beneficial for the industry and regulators to have a single set of statewide regulations for siting and drilling hydraulic fracturing wells, watershed characteristics and geology vary across the state. Furthermore, the health and welfare of communities are best protected by local zoning. Not enough data have been gathered and studied to know what horizontal distance from a drinking water well and aquifer will guarantee safety for the water supply across the various localities of the state, and the 500 feet proposed under HB 1950 has no basis in scientific fact. The proposed legislation seems rushed, driven by the desire to generate taxes and jobs. The gas will still be there if we take the time to understand fracking well enough to release the gas from the shale formations without significant damage to our water resources and communities.

Drilling requires large amounts of water to create a circulating mud that cools the bit and carries the rock cuttings out of the borehole. After drilling, the shale formation is stimulated by hydro fracking: on average 2-5 million gallons of chemicals and water are pumped into the shale formation at 9,000 pounds per square inch, which literally cracks the shale or breaks open existing cracks and allows the trapped natural gas to flow. For gas to flow out of the shale, all of the water not absorbed by the formation during fracking must be recovered and disposed of. Though less than 0.5% by volume, the proprietary chemicals represent about 15,000 gallons in the wastewater recovered from a typical hydro fracking job. The chemicals serve to increase the viscosity of the water to a gel-like consistency so that it can carry the propping agent (typically sand) into the fractures to hold them open so that the gas can flow.
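The 15,000-gallon figure follows directly from the percentages: 0.5% of a mid-range 3-million-gallon job. A quick check, not a field measurement:

```python
# Chemical volume implied by the figures above: roughly 0.5% of the
# water pumped downhole, for a mid-range 3-million-gallon frack job.
water_gallons = 3_000_000   # midpoint of the 2-5 million gallon range
chemical_fraction = 0.005   # "less than 0.5% by volume"

print(int(water_gallons * chemical_fraction))  # 15000 gallons of chemicals
```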

Drawing large quantities of water in a limited period of time could impact water supplies. Proper methods must be determined for the safe disposal of the large quantities of fracking fluid, which may also contain contaminants from the geological formation, including brines, heavy metals, radionuclides and organic contaminants, and the impact of that disposal must be monitored. Geologists and engineers believe that in hydraulic fracturing the intervening layers of rock prevent a fissure from extending into the water table. The problems seen in drinking water wells near hydro fracking jobs typically occur when fracking fluid seeps into drinking water wells through improperly sealed or abandoned drilling wells (a large number of the problems have occurred in older coal bed areas). Proper well construction and abandonment standards to protect watersheds need to be developed and enforced. The water that is absorbed into rock formations may change the formations and the hydraulic balance in ways we do not understand.

Finally, care must be taken to avoid degradation of watersheds and streams from the industry itself as large quantities of heavy equipment and supplies are moved on rural (and potentially residential) roads and placed on concrete pads. The picture above from the U.S. Geological Survey, USGS, shows the amount of equipment involved in a hydro frack. The watersheds must be monitored. Sampling should take place before fracking and at regular intervals after a hydro frack job. We need to proceed slowly to make sure that we are doing it right and protecting our water resources. We have only a small margin for error.

Thursday, December 15, 2011

Durban in the End



More than 10,000 ministers, officials, activists and scientists from 194 countries met in Durban in what has become an annual ritual. Durban, the 17th annual Conference of the Parties (COP17) held since the United Nations first began to coordinate an attempt to control global warming, has concluded with little result. The Conference of the Parties' stated goal is to prevent temperatures from increasing more than 2 degrees Celsius by the end of the century. They are trying to achieve this goal by reducing CO2 emissions through a treaty to expand and extend the Kyoto Protocol and by taxing all the developed nations to pay for climate impacts on poorer nations through the Green Fund for climate assistance.

Global CO2 emissions have gone from 21 billion tons in 1990 to 29 billion tons in 2009, according to data from the International Energy Agency (IEA). Global emissions of CO2 increased 38% despite the Kyoto participants falling 14.7% below their 1990 level and the United States increasing only 6.7% above its 1990 level, from 4.9 billion tons of CO2 to 5.2 billion tons. The bulk of the increase has come from China, Africa, the Middle East, India and the rest of Asia. The United States and the 35 Kyoto participants together represent less than half of world CO2 emissions, and that share is shrinking every year.

In Durban, governments including China and the United States agreed to negotiate an “agreed outcome with legal force” as soon as possible, and no later than 2015, with the agreement to take effect no later than 2020. Work will begin on this immediately under a new group called the Ad Hoc Working Group on the Durban Platform for Enhanced Action. In addition, governments pledged to contribute start-up costs for the Green Climate Fund to support developing nations, as agreed last year in Cancun, Mexico, and to have more meetings. The next UNFCCC Climate Change Conference, COP 18/CMP 8, is to take place 26 November to 7 December 2012 in Qatar. They agreed to keep talking and negotiating, but the movement seems to have lost momentum. The US and Europe are barely relevant in the conversation; China, India, Latin America, and the Middle East are now the engines of growth in CO2 emissions.

The 35 nations participating in Kyoto agreed to a second commitment period of the Kyoto Protocol beginning January 1, 2013. Parties to this second period will quantify emission limits or reductions and submit them for review by May 1, 2012. As expected, Canada promptly withdrew from the Kyoto Treaty, and it remains to be seen whether Russia and Japan will remain within it. The participants in the Kyoto treaty now represent less than 24% of global CO2 emissions; without Japan and the Russian Federation they represent 15%.

From 2001 to 2010 global temperatures did not increase, but remained approximately 1.13°F warmer than the average global surface temperature from 1951 to 1980. To measure climate change, scientists look at long-term trends. The temperature trend, including data from 2010, shows the climate has warmed by approximately 0.36°F per decade since the late 1970s. Carbon dioxide has shown a less direct relationship to global temperatures than the climate models had predicted, which seems fortunate given the significant increase in world CO2 emissions in the past two decades.

At the Copenhagen meeting in 2009, President Obama pledged to reduce U.S. greenhouse gas emissions to 17% below 2005 levels by 2020, though it is unclear whether this commitment is in any way binding. Due to the recent drop in industrial production and electricity usage, we have already cut U.S. emissions by 6% from 2005 levels; the Administration is well on its way to achieving this goal.

Monday, December 12, 2011

Environmental Impacts from Fracking

The oil and gas industry's ability to pull, push or otherwise draw hydrocarbons from the earth has exceeded our knowledge of geology and groundwater and gotten ahead of our regulations, which were created for traditional oil and gas wells. In the lingering Texas drought, the oil and gas industry finds itself competing with other users, towns and ranches, for the millions of gallons of water necessary to hydro fracture a well. In some areas ranchers are selling water rights, while in other areas drillers are being limited in how much water they can draw from the aquifer. The portion of the water used to hydro fracture a well that can be recovered and reused for other fracking jobs is determined by flowback and by how the water is treated or disposed of. Flowback from fracking is determined by geology and the amount of water absorbed by the rock formations; the rest must flow out of the well to allow the released gas to flow.

In Texas, groundwater is being used to frack wells at an unsustainable rate. Texas groundwater belongs to the landowner and is governed by the rule of capture, which grants landowners the right to capture the water beneath their property. Landowners have a right to pump and capture whatever water is available, regardless of the effects of that pumping on neighboring wells; any single landowner in a watershed could in effect sell all the groundwater, quite legally taking their neighbors' water. Groundwater should not be used beyond its recharge rate or it will ultimately be depleted, leaving communities without adequate water to support them.

The water that is absorbed into rock formations may change the formations in ways we do not understand. Though the water in the hydro frack is exempted from the Safe Drinking Water Act (by a 2005 act of Congress), the flowback is not and must be disposed of under state regulations. The flowback water itself is a problem: it contains “proprietary” chemicals and contaminants from the geological formation. Researchers at the University of Texas at Austin were part of a team that studied a series of small earthquakes that struck near Dallas, Texas in 2008 and 2009, in an area where natural gas companies had used fracking. The epicenter of the quakes turned out to be about half a mile from a deep injection disposal well under the Dallas-Fort Worth International Airport used to dispose of the fracking fluid. The largest earthquake of the series measured 3.3 on the Richter scale, a very small earthquake. In a study published in the Bulletin of the Seismological Society of America, the researchers also reviewed records from US Geological Survey seismic-recording stations in Oklahoma and Dallas. The researchers concluded that the fracking itself did not cause the earthquakes, but there appeared to be a relationship between the deep-well injection of the fracking fluid and the earthquakes.

For years the US Geological Survey has been studying the factors that affect how groundwater wells respond to earthquakes, including the magnitude and depth of the earthquake, the distance from the epicenter, and the type of rock that surrounds the groundwater. The depth of the well, whether the aquifer is confined or unconfined, and the well construction also influence the degree of water-level fluctuation in wells in response to seismic waves. It has been suggested that some aquifers may even act as resonators, amplifying the response. Even a small earthquake is not without consequences for the groundwater in the surrounding area. Fracking may also have impacts on nearby water wells: water injected into a previously dry formation may act as a resonator, or as a lubricant that allows the formation to slide.

Local geology determines the danger of fracking to the water table. In Wyoming, where the water table is deep and the gas shallow, the drinking water has been impacted. The Environmental Protection Agency, EPA, announced last Thursday that glycols, alcohols, methane and benzene found in a well the EPA drilled into the drinking water aquifer within the Pavillion field were consistent with gas production and hydraulic fracturing fluids, and were likely due to fracking. The oil company responsible for these wells claims that the results are inconclusive because methane can naturally seep into groundwater wells that provide drinking water. That is a rare occurrence usually confined to deeper water wells in coal-producing areas, but these were deeper wells in a coal-producing area. Benzene also occurs in nature, but I can find no instance where benzene was introduced into drinking water by purely natural action, though it could have been introduced into the water by previous generations of oil and gas development. Benzene, glycols and alcohols are all common substances in fracking fluids. In 2004, when EPA first looked at hydro fracking, it coordinated a voluntary agreement with the three largest fracking contractors (Halliburton, BJ Services, and Schlumberger) to stop using diesel fuel in hydro fracking. Until 2004 diesel had been commonly used in hydro fracking both coal bed gas and the deeper shale gas. Diesel contains benzene, so it seemed likely to the investigators that these chemicals were introduced by fracking.

EPA constructed two deep monitoring wells to sample water in the aquifer, and tested private and public drinking water wells in the community. The samples were consistent with chemicals identified in earlier EPA results released in 2010 and were within established health and safety standards for most substances, but not for benzene. Up to 246 micrograms of benzene per liter of water was found in one well, far above the safe drinking water standard of 5 micrograms per liter. The geology of Pavillion, Wyoming is unusual for shale gas formations. The shale is much shallower than in the Haynesville shale and the Marcellus shale, though there is a shallow area of the Fayetteville shale. The fracturing in Wyoming took place both within the water table and a few hundred feet below the drinking water aquifer, close to drinking water wells.
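A quick back-of-envelope check of the benzene figures above, comparing the highest level EPA reported in one Pavillion well against the federal drinking water standard:

```python
# Benzene figures quoted in the post: 246 micrograms per liter measured
# in one well, versus the 5 microgram-per-liter safe drinking water standard.
measured_ug_per_l = 246   # highest benzene level EPA found in one well
standard_ug_per_l = 5     # federal maximum contaminant level for benzene

times_over_standard = measured_ug_per_l / standard_ug_per_l
print(f"Benzene at {measured_ug_per_l} ug/L is about "
      f"{times_over_standard:.0f} times the {standard_ug_per_l} ug/L standard")
```

That is roughly fifty times the allowable concentration, which puts “far above” in perspective.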

In hydraulic fracking, on average 2-5 million gallons of chemicals and water are pumped into the shale formation at 9,000 pounds per square inch, literally cracking the shale or breaking open existing cracks and allowing the trapped natural gas to flow. While geologists and engineers believe that in hydraulic fracturing the intervening layers of rock prevent a fissure from extending into the water table, they base this on a “typical” geology where there are thousands of feet between the water table and the fracking location, and that belief does not account for any potential impacts on the hydraulic balance in a watershed. The problems seen in drinking water wells near hydro fracking jobs typically occur when fracking fluid seeps into drinking water wells through improperly sealed or abandoned drilling wells (a large number of the problems have occurred in older coal bed areas). In Pavillion, however, the groundwater is within a few hundred feet of the gas reserves, so the groundwater is more easily directly impacted by fracking. In addition, there had been previous development of the oil and gas resources, opening the possibility of improperly abandoned or sealed wells. In Pavillion, Wyoming they used hydro fracking within the water table near the drinking water wells. It is not at all surprising that they contaminated the water supply. What is surprising is that the business and the regulator allowed this to happen. They did it without thinking about the potential consequences because it was legal.

The oil and gas industry has outpaced regulators, our knowledge of the consequences and, it seems, common sense. It is essential to determine the vertical and horizontal separation necessary to protect the drinking water aquifers from fracking, and what impact new rounds of hydraulic fracturing can have on previously developed areas with old abandoned wells, before watersheds are destroyed. Then increase oversight to ensure that this separation is maintained (despite inevitable requests for waivers), improve well-design requirements and ensure their consistent implementation, and require the appropriate treatment and recycling of drilling waste water. Using waste water treatment plants that were designed to address biological solids to treat millions of gallons of water used for hydraulic fracturing, or simply ponding the waste, is short sighted and imprudent. The deep well injection commonly used in Texas may have consequences beyond small earthquakes.

Drilling requires large amounts of water to create a circulating mud that cools the bit and carries the rock cuttings out of the borehole. After drilling, the shale formation is stimulated by hydro fracking, using 2-5 million gallons of water. For gas to flow out of the shale, all of the water not absorbed by the formation during fracking must be recovered and disposed of. Though less than 0.5% by volume, the proprietary chemicals represent about 15,000 gallons in the waste water recovered from a typical hydro fracking job. The chemicals serve to increase the viscosity of the water to a gel-like consistency so that it can carry the propping agent (typically sand) into the fractures to hold them open so that the gas can flow. Proper methods must be determined for the safe disposal of the large quantities of this fracking fluid, which may also contain contaminants from the geological formation including brines, heavy metals, radionuclides and organic contaminants, and the impact from this disposal must be monitored. The impact of so much waste water on our water resources must be monitored and addressed.
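The arithmetic behind the chemical volumes quoted above is simple; here is a sketch using a 3-million-gallon job as an illustrative midpoint of the 2-5 million gallon range (the midpoint is my assumption, not a figure from the post):

```python
# A typical shale frack uses 2-5 million gallons of water with chemical
# additives at up to roughly 0.5% by volume. Taking 3 million gallons
# as an assumed illustrative job size:
water_gallons = 3_000_000
additive_fraction = 0.005   # 0.5% by volume

additive_gallons = water_gallons * additive_fraction
print(f"~{additive_gallons:,.0f} gallons of chemical additives per job")
```

At 0.5% of 3 million gallons, the additives alone come to about 15,000 gallons, which is where the figure in the paragraph above comes from.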

While most states require drillers to dispose of fracking waste water in deep wells below impermeable rock layers, Pennsylvania, which has no such deep wells, has allowed drillers to discharge their fracking waste water through sewage treatment plants into rivers. Sewage treatment plants are designed to separate solids and use bacteria to treat biological waste. They are not equipped to remove or neutralize the contaminants in used hydro fracking fluid. In 2009 and 2010, public sewage treatment plants in Pennsylvania directly upstream from drinking-water intake facilities accepted wastewater that contained radionuclides at levels hundreds, even thousands, of times the drinking-water standard, despite the fact that these plants (like most sewage plants) were exempt from monitoring for radiation. Local regulators and gas producers believed, without sampling, that the waste was not a threat because it would be diluted by treatment in the sewage treatment plants or by the river itself. They guessed at the environmental impact.

Finally, care must be taken to avoid degradation of watersheds and streams from the industry itself as large quantities of heavy equipment and supplies are moved on rural roads and placed on concrete pads. The watersheds must be monitored. Recent incidents and reports of potential contamination of drinking water supplies from fracking and from the waste water of the fracking process underscore the dangers. The New York Times brought to light a 1987 EPA report to Congress titled “Management of Wastes from the Exploration, Development and Production of Crude Oil, Natural Gas and Geothermal Energy.” Corroborating documentation was obtained by the New York Times from state archives and from the EPA’s library. It appears that, though seemingly forgotten, EPA had been aware of at least one well documented case of drinking water well contamination from fracking for 25 years. In addition, there are reports from several states noting contamination of drinking water wells in association with fracking, though no definitive proof; because of the lack of adequate testing and the difficulty of understanding groundwater, the full extent to which hydro fracking fluids have contaminated or might in the future contaminate groundwater is unknown. However, many cases of associated contamination have been confirmed.

Thursday, December 8, 2011

World CO2 Emissions and Durban


More than 10,000 ministers, officials, activists and scientists from 194 countries are meeting in Durban in what appears to be a last ditch attempt to extend the Kyoto treaty and to try to tax all the developed nations to pay for climate impacts on poorer nations through the Green Fund for climate assistance. Durban, the 17th annual Conference of the Parties (COP17) held since the United Nations first began to coordinate an attempt to control global warming through carbon dioxide control, has reached the final stretch. At this point it appears that the conference will close without any agreement. The European Union refuses to extend the treaty without the United States and China committing, and neither country appears likely to make any legally binding commitment. The Climate Change movement has lost its urgency. The failure to get any binding international agreement in Durban may be caused by the global economic problems or by the failure of the Global Warming/Climate Change models to predict temperatures. Levels of greenhouse gases are higher than the worst-case scenario outlined by climate experts just four years ago, but temperatures have not risen as projected by the climate models.

The 1997 Kyoto Protocol bound developed countries to cuts of about 5-6% from 1990 levels in global emissions of greenhouse gases, as represented by carbon dioxide, by 2012. President George W. Bush rejected Kyoto in 2001, saying it did not impose emissions limits on emerging industrialized nations, chiefly China and India, and now China has surpassed the United States as the world’s largest emitter of greenhouse gases. China (6.9 billion tons in 2009), the United States (5.2 billion tons in 2009), India, the Russian Federation (1.5 billion tons in 2009) and the European Union (3.0 billion tons in 2009) were the largest contributors to global emissions, which grew to a total of almost 30 billion tons of CO2 in 2009 (the specific breakout for 2010 was unavailable from the International Energy Agency, IEA, but the worldwide increase was about 6% in 2010). Canada, which signed the Kyoto pact, blew through its CO2 targets, exceeding its 2000 levels, and joined the United States among the highest per capita emitters on the planet. Canada had agreed to cut emissions 6% below 1990 levels by 2012 as part of the Kyoto Protocol, but Canada’s emissions (0.7 billion tons in 2009) are now 17% above 1990 levels, largely because of increased emissions related to the development of the Canadian oil industry. Canada failed to meet its Kyoto targets because it refused to take the large economic hit necessary for a big, cold, northern, sparsely populated, oil and natural gas producing nation to achieve them. There are no meaningful penalties for missing a Kyoto emission target. Even the most cooperative countries are missing their Kyoto targets.
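Summing the 2009 country figures quoted above shows how concentrated emissions are, and why an agreement without the largest emitters accomplishes little (the ~30 billion ton world total is the figure cited in the post):

```python
# 2009 CO2 emissions quoted in the post, in billions of tons,
# compared against the roughly 30-billion-ton world total.
emissions_bt = {
    "China": 6.9,
    "United States": 5.2,
    "European Union": 3.0,
    "Russian Federation": 1.5,
    "Canada": 0.7,
}
world_total_bt = 30.0

subtotal = sum(emissions_bt.values())
for country, bt in emissions_bt.items():
    print(f"{country}: {bt} billion tons ({bt / world_total_bt:.0%} of world total)")
print(f"These five emitters together: {subtotal:.1f} billion tons, "
      f"{subtotal / world_total_bt:.0%} of the world total")
```

Just these five emitters account for well over half the world total, so a treaty that exempts or loses even one or two of them cannot meaningfully bend the curve.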

However, Japan has been faithful to its word. Japan’s Trade Ministry said on Tuesday that emissions of CO2 fell 5.6% to 1.075 billion tons in the year ended March 2010, bringing Japan below its Kyoto goal of 1.186 billion tons a year when taking into account the volumes of carbon offsets Japan has bought from abroad. However, Japan announced that it is reconsidering plans to cut carbon-dioxide emissions by 25% by 2020, due to the closing of a significant portion of its nuclear power generation and the costs of the carbon-credit programs, which have cost the country almost $11 billion to purchase carbon offsets by investing in carbon abatement programs in other countries.

The failure to get a binding international agreement in Durban has the Climate Model believers in a frenzy, as CO2 emissions were up 6% in 2010, to over 30 billion tons, 40% above the 1990 level. This level of CO2 is higher than the worst-case scenario outlined by climate experts just four years ago. Securing a commitment from major polluters such as China and India to sign up to a Kyoto II in the future, a move spearheaded by the British and European Union Energy Secretaries, appears doomed to failure. The failure may also be caused by the continuing steep rises in annual global CO2 emissions without an accompanying significant rise in global temperatures; greenhouse gas levels have exceeded the worst-case projections, but temperatures have not risen as projected by the climate models. The relationship of climate change to worldwide CO2 levels may not be the one assumed in the climate models. In addition, the difficulty of reducing CO2 levels worldwide can be seen in the diagram above. Canada, Russia, and Japan withdrawing from the Kyoto Treaty, and the United States not making a binding commitment despite President Obama’s pledge in Copenhagen to reduce United States emissions of CO2 17% by 2020, have doomed Durban.

Monday, December 5, 2011

Fracking in Ohio

The U.S. Forest Service has withdrawn more than 3,200 acres of forest land from a federal oil and gas lease sale scheduled for Wednesday, December 7, 2011. The acreage in Athens, Gallia, and Perry counties was to be included in a broader sale of leases for 20,949 acres of federal land in Ohio, Mississippi and Louisiana. This land was to be auctioned for hydraulic fracturing. The Buckeye Forest Council, an environmental coalition, opposed the sale, stating that the environmental impact statement was outdated because it did not mention hydraulic fracturing. In addition, they feel that Ohio does not have the regulatory framework to deal safely with fracking. The auction plan is on hold pending the review of the 2006 environmental impact statement, which could take up to six months and could lead to required revisions that would delay the auction further. The 3,200 acres currently hold nearly 1,300 shallow gas wells.

Our ability to recover natural gas buried a mile or more beneath the earth has increased. Advances in horizontal drilling, which allows a vertically drilled well to turn and run thousands of feet laterally through the earth, combined with advances in hydraulic fracking, the pumping of millions of gallons of water laced with thousands of gallons of chemicals into shale at high pressure, have increased our ability to recover natural gas from shale. Hydraulic fracking is a technology that was unknown 60 years ago. Until recently there was no economically feasible way to extract this gas.

Though industry executives say fracking has been widely used for decades without problems, hydraulic fracturing has changed; the type of hydraulic fracturing the industry executives are talking about is coal bed formation fracturing. The volume of water needed for hydraulic fracturing varies by site and type of formation. Fifty thousand to 350,000 gallons of water may be required to fracture one well in a coal bed formation, while two to five million gallons of water may be necessary to fracture one horizontal well in a shale formation. Water used for fracturing fluids is acquired from surface water or groundwater in the local area. Wastewater from the hydraulic fracturing process must be disposed of, and several methods have been used: disposing of the water underground in injection wells, discharging it to surface waters after treatment in a waste water treatment plant designed to remove only solids and biological contaminants, or applying it to land surfaces where it can seep into the water table.

The millions of gallons of water used for fracking shale contain up to 15,000 gallons of chemical additives. The chemicals serve to increase the viscosity of the water to a gel-like consistency so that it can carry the propping agent (typically sand) into the fractures to hold them open so that the gas can flow. Determining the proper methods for the safe disposal of the large quantities of this fracking fluid, which may also contain contaminants from the geological formation including brines, heavy metals, radionuclides and organic contaminants, is essential. The deep well injection of the waste in Texas is believed by researchers to have triggered the small earthquakes near the Dallas airport. The impact of so much waste water on our water resources must be measured and monitored. Finally, care must be taken to avoid degradation of watersheds and streams from the industry itself as large quantities of heavy equipment and supplies are moved on rural roads and placed on concrete pads.

There are many possible routes to contamination from fracking. Errors in natural gas well construction or spills during injection can occur and lead to drinking water contamination. In Pennsylvania, flammable levels of methane in drinking water wells and potassium chloride levels high enough to salinize a drinking water aquifer have been reported in the vicinity of some gas wells. Fracking fluids can spill before they are injected, and fluids recovered from fracturing can contaminate surface waters. The EPA estimates that 15-80% of the volume of fracking fluids injected will be recovered; the amount of fluid recovered depends on the site geology. Additionally, drilling through the water table can create pathways for fracking fluids or natural gas to find their way into water supplies and wells if the grouting isn’t properly done and the gas well isn’t properly constructed. The horizontal sections of the wells are not cased in cement and introduce a potential point where fracking fluids can reach the outside of the grouting during flowback.

Hydraulic fracturing should proceed slowly. A limited number of wells should be installed, with careful monitoring of local and regional groundwater supplies as well as verification of proper well construction and wastewater recycling. Limiting fracking to a small area of the federal and state forest lands would allow the development of experience, knowledge and data, and could ensure careful restoration of the area. Instead of leaving unwary homeowners to the “land men” and their leases written entirely to favor and protect the drilling and gas companies, let the state governments develop standard language for gas leases and the federal government collect real time data in a secluded area away from residential impact.

Currently, the US Environmental Protection Agency (EPA) is studying the impact of hydraulic fracturing on water resources, but it is focusing only on the potential to directly pollute the drinking water aquifer, not on potential changes in the groundwater hydrology. The geological impact of hydraulic fracturing should be examined by the U.S. Geological Survey. No one has ever looked at the long-term implications for the hydraulic balance when fracking occurs. The removal of millions of gallons of water, the fracturing of the geological formations, and the injection of contaminants, even at low concentrations, into the subsurface could cause significant changes in groundwater flow and quality.

The current regulatory framework concerning hydraulic fracturing has a number of gaps that need to be addressed before unlimited fracking takes place. Several recommendations were made in the report of the Shale Gas Subcommittee of the Secretary of Energy Advisory Board. The report took a rational approach to regulation, recommending disclosure, testing, evaluation and modification of regulation and practices based on the information and data obtained. It assumes information and data will be gathered and analyzed. That is not yet being done. The data needs to be collected at the state level and provided to the US Geological Survey and US EPA to consolidate at a national level.

In the past decade the advances in drilling and fracking technology have been adapted to exploit gas in the Barnett shale in the Fort Worth Basin in Texas and applied to a series of major shale gas deposits that could not have been viable without the advances in drilling and fracking. The Fayetteville shale, the Haynesville shale, the Marcellus shale reserves all in the United States and the Horn River shale reserves in Canada are now accessible. At the current rate of natural gas consumption North America is reported to have a 100-year supply of proven, producible reserves and even with expanded use of natural gas, there is more than a generation of currently accessible reserves. We need to treat both the earth and its resources with respect.

In truth we have no viable alternative to hydrocarbon fuel. When the oil and gas are gone it will be a poorer future, without airplanes, freighters and trucks. Sailing ships will not transport raw materials and finished goods around the earth. Solar and wind power will produce unreliable power supplies, and mankind will adapt (not happily) or discover new sources of fuel. Before that future world arrives, the shale gas and oil sands and whatever else is discovered will be exploited. There is no urgency, but you cannot permanently stop that trend. These deposits will become more valuable over time as the world becomes more desperate for energy. Now is the time to carefully develop and study the methods to exploit these resources without destroying or further damaging the earth.

Thursday, December 1, 2011

United Nations Climate Summit and More Emails from East Anglia

The United Nations Climate Summit in Durban, South Africa began on Monday, November 28, 2011 and will run until December 9, 2011. The Durban meeting is the 17th conference of the parties to the United Nations convention on climate change, or COP17. This international meeting might have been entirely ignored by the general public, given the economic turmoil in Europe and the United States, but for the release last week of a new batch of emails reported to have been stolen from the servers at the University of East Anglia.

I spent a couple of hours randomly reading emails and did not find any new insights. This appears to be another group of emails very much overlapping the 2009 hacked emails from the University of East Anglia's Climate Research Unit (CRU), a collaborator with the U.N.'s Intergovernmental Panel on Climate Change. The released emails revealed some researchers’ willingness to suppress or massage data and to use the peer-review process to control the publication of scholarly work and suppress dissenting points of view. The hacked emails have shown some of the weaknesses in the climate data and models used to forecast global warming, as well as some rather questionable behavior by scientists in controlling the information provided to the public, even to the extent of reviewing and approving BBC reports. There really does not appear to be much that is new in this latest group of emails, though the emphasis on the BBC’s lack of objective reporting seemed new to me.

After the first release of emails, the Intergovernmental Panel on Climate Change (IPCC) investigated the claims of scientific wrongdoing. In its report in August 2010, it recommended improvements in the management structure of the IPCC, ensuring that the data included in its reports had been properly published in the scientific literature, and finally that the full range of scientific opinion be reflected in the reports. Nonetheless, the IPCC confirmed its conclusions that the earth is warming and that the activities of mankind have caused this warming. The Berkeley Earth Surface Temperature Study finished its analysis of global temperature records this past fall and confirmed global warming since the industrial revolution. EPA Administrator Lisa Jackson never wavered from the EPA’s full acceptance of findings reached by outside groups, including the Intergovernmental Panel on Climate Change, which Administrator Jackson explained "relied on decades of sound, peer-reviewed, extensively evaluated scientific data that the combined emissions of …greenhouse gases in the atmosphere threaten the public health and welfare of current and future generations." The EPA has proceeded to create a number of new regulations, including 2025 targets for auto mileage and power plant emissions standards on mercury, after putting the direct greenhouse gas regulation of power plants on hold this past fall.

This month the Department of Energy, DOE, reported that in 2009-2010 the world pumped out almost 6% more carbon dioxide than during the previous year. According to the DOE’s Carbon Dioxide Information Analysis Center web site, the increase is due to increased emissions from the People's Republic of China. Since 2001 global carbon dioxide emissions have increased 33%. During this same period, however, U.S. carbon dioxide emissions have not increased, and our national impact has become less relevant. From 2001-2010 global temperatures have not increased, but remain approximately 1.13°F warmer than the average global surface temperature from 1951 to 1980. To measure climate change, scientists look at long-term trends. The temperature trend, including data from 2010, shows the climate has warmed by approximately 0.36°F per decade since the late 1970s. Carbon dioxide has shown a less direct relationship to global temperatures than the climate models had predicted.
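As a rough consistency check, the two temperature figures above fit together: a trend of about 0.36°F per decade running from the late 1970s should roughly reproduce the ~1.13°F anomaly. The 1979 start year below is my assumption for illustration, not a figure from the post:

```python
# Rough consistency check on the two figures quoted above: a warming
# trend of ~0.36 F per decade since the late 1970s, versus the ~1.13 F
# anomaly relative to the 1951-1980 average. Assumed start year: 1979.
trend_f_per_decade = 0.36
start_year, end_year = 1979, 2010

decades = (end_year - start_year) / 10
implied_warming_f = trend_f_per_decade * decades
print(f"{decades:.1f} decades x {trend_f_per_decade} F/decade "
      f"= ~{implied_warming_f:.2f} F of warming")
```

The implied warming of a bit over 1.1°F is in line with the 1.13°F anomaly, so the two numbers are at least mutually consistent.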

The U.S. never ratified Kyoto, arguing it should contain 2012 goals for emerging economies and would cost U.S. jobs. The U.S. has also failed to adopt a comprehensive domestic program for reducing its own greenhouse gas emissions, despite recent regulatory activity by the EPA and DOE funding of billions of dollars of solar projects disguised as a loan guarantee program. The American Clean Energy and Security Act, also known as the Waxman-Markey energy bill, failed to pass the Senate in 2010 and is seen as dead. The California Air Resources Board, though, unanimously voted to adopt the nation's first state-administered cap-and-trade regulations for greenhouse gases in 2011. Cap-and-trade is the centerpiece of AB 32, the Global Warming Solutions Act of 2006, a California law designed to achieve quantifiable reductions of greenhouse gases. At the Copenhagen meeting in 2009 President Obama pledged to reduce U.S. greenhouse gas emissions to 17% below the 2005 levels by 2020. Due to the recent drop in industrial production and electricity usage, we have already cut U.S. emissions by 6%; the Administration is well on its way to achieving this goal. The California cap-and-trade requirements will help the current crop of California renewable energy projects funded under the DOE program to reduce power consumption in California by increasing the cost of electricity by at least 15% above the cost of using natural gas, according to the Division of Ratepayer Advocates at the California PUC.

The newest release of hacked emails serves to turn our attention to the proceedings in Durban more than anything else. The Kyoto Protocol, which committed developed nations to cut their emissions, is set to expire in 2012. After both the Copenhagen (2009) and Cancun (2010) climate summits failed to produce a legally binding climate treaty, delegates to the Durban talks are under immense pressure to produce some kind of deal that will be acceptable to both rich and developing nations. However, it is reported that the cap-and-trade concept is losing support among the previous signers of the Kyoto treaty. Canada, Japan and Russia have stated that they will not agree to an extension of Kyoto unless China, India and Brazil, now major producers of greenhouse gases, become subject to the requirements.

The “emerging nations,” including China, India and Brazil, want an extension of Kyoto, which required the industrialized nations to cut greenhouse gas emissions by 5.2% below 1990 levels from 2008-12. The world's two largest greenhouse gas emitters are China and the United States. China, because of concern about employment and slowing international demand for its products, has no interest in having any climate treaty apply to its nation; China’s economy appears to be slowing down significantly, based on the falling global demand for oil and copper. China was exempted as an emerging economy, and though it is now the largest greenhouse gas emitter on earth, it wants to remain exempt from reducing or even stabilizing greenhouse gas emissions under any new agreement. In September India announced that it would not accept any legally binding limits on greenhouse gas emissions, and Japan announced that it is reconsidering plans to cut carbon-dioxide emissions by 25% by 2020, due to the closing of a significant portion of its nuclear power generation and the costs of the carbon-credit programs, which required spending almost $11 billion on carbon abatement programs in other countries during a decade-long economic malaise.

Overall, expectations for the future of the Kyoto Protocol are low, and some doubt whether a second commitment period is feasible with support only from the EU, which accounts for only around 11% of the world’s greenhouse gas emissions and is itself reconsidering its nuclear power generation after the Fukushima Daiichi nuclear reactors were damaged in the recent earthquake. If nuclear reactors are going to be phased out as low greenhouse gas emission power generation, there is no way to achieve carbon reductions without reducing the size of the economy, the standard of living or the size of the population. During her opening remarks to the conference, Executive Secretary of the UN Framework Convention on Climate Change Christiana Figueres said countries can take two major steps in Durban to address climate change. The first is completing a comprehensive package to help developing countries adapt to climate change and limit the growth of their greenhouse gas emissions; the second relates to how governments can work together to limit the global temperature rise and thus prevent further natural disasters. This seems to be a stepping away from the more rigorous stance of previous conferences.

Monday, November 28, 2011

Fracking in New York

Last year, New York placed a moratorium on drilling in the Marcellus Shale while it assessed the effects of fracking. The New York Department of Environmental Conservation’s draft environmental impact statement (EIS) on drilling was released almost three months ago and recommends that drilling be permitted, but with conditions. The comment period ends on December 12, 2011, and most likely the ban on hydro fracking in New York will end with it, despite several groups’ attempts to extend the comment period three more months.

The EIS places restrictions on drillers to address groundwater concerns. It mandates that drillers not drill within a certain distance of watersheds or aquifers and that more stringent well construction standards be met. These recommendations are in line with those issued by the Shale Gas Subcommittee of the Secretary of Energy Advisory Board this past spring. That report took a rational approach to regulation, recommending disclosure, testing, evaluation and modification of regulation and practices based on the information and data obtained. The report is to some extent a collection of the best regulatory frameworks among the states and covers little new ground, overlooking some of the significant questions. This was a subcommittee at the Department of Energy that reports to the Secretary of Energy. However, EPA will be the regulatory agency, and it is currently engaged in a multi-year study of hydraulic fracturing. There is not yet enough data to understand the full impacts of fracking.

There is tremendous pressure to lift the moratorium on fracking. A large swath of New York sits atop the Marcellus Shale, the third-largest natural gas field currently known in the world. The Marcellus Shale alone is estimated to hold 500 trillion cubic feet of gas. This resource could heat our homes for a generation or more, power our electrical generating plants, and even fuel cars either directly or through plug-in hybrids. The possible impacts to our economy and environment are far reaching. The potential risks are also far reaching.
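As a rough check on the “generation or more” claim, the sketch below divides the cited reserve estimate by an assumed U.S. natural gas consumption of roughly 24 trillion cubic feet per year (a ballpark figure for around 2010, not taken from this post):

```python
# Back-of-the-envelope only; the consumption figure is an assumption.
reserve_tcf = 500.0      # Marcellus Shale estimate cited above
annual_use_tcf = 24.0    # assumed total U.S. consumption per year (~2010)

years = reserve_tcf / annual_use_tcf
print(f"Approximate supply at current use: {years:.0f} years")  # roughly 21 years
```

At that assumed rate of use the reserve works out to about two decades of total U.S. consumption, which is consistent with the post’s characterization.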

Our ability to recover natural gas buried a mile or more beneath the earth has increased. Advances in horizontal drilling, which allows a vertically drilled well to turn and run thousands of feet laterally through the earth, combined with advances in hydraulic fracking, the pumping of millions of gallons of chemicals and water into shale at high pressure, have increased our ability to recover natural gas from shale. Hydraulic fracking was unknown 60 years ago, and advances in the past 15 years have made it possible to economically access this gas. Our knowledge of the impacts of fracking has lagged behind our ability to access the gas.

In hydraulic fracking, on average 2-3 million gallons of chemicals and water are pumped into the shale formation at 9,000 pounds per square inch, which literally cracks the shale or breaks open existing cracks and allows the trapped natural gas to flow. While geologists and engineers believe there is little risk that the fracking “water,” a mix of chemicals and water, will somehow infiltrate groundwater reserves through a fissure created by the fracking, there are other routes of contamination and impact. It is believed that the intervening layers of rock would prevent a fissure from extending thousands of feet to the water table; the other risks lie in how we build wells and fracture the shale, which the EIS attempts to address.

There have been documented cases of seepage into drinking water wells through improperly sealed or abandoned drilling wells. An ongoing monitoring and data collection program needs to be part of the permitting process. Potential impacts to our water supply from hydraulic fracking need to be studied over time, and regulations modified to better protect our water supplies and natural resources as fracking expands in the region. Drilling requires large amounts of water to create a circulating mud that cools the bit and carries the rock cuttings out of the borehole. After drilling, the shale formation is then stimulated by hydraulic fracking, using up to 3 million gallons of water.

Data needs to be gathered on the impact to water resources of supplying water for the construction of thousands of wells per year. For gas to flow out of the shale, nearly all of the water injected into the well during fracking must be recovered and disposed of. Though they make up less than 0.5% by volume, the proprietary chemicals can account for 15,000 gallons of the waste from a hydro fracking job. The chemicals serve to increase the viscosity of the water to a gel-like consistency so that it can carry the propping agent (typically sand) into the fractures to hold them open so that the gas can flow. Proper methods must be determined for the safe disposal of the large quantities of this fracking fluid, which may also contain contaminants picked up from the geological formation, including brines, heavy metals, radionuclides and organic contaminants, and the impact of that disposal must be monitored. The impact of so much waste water on our water resources must be measured and monitored. Finally, care must be taken to avoid degradation of watersheds and streams from the industry itself as large quantities of heavy equipment and supplies are moved on rural roads and placed on concrete pads. The watersheds must be monitored, and permitting should not exceed our ability to monitor the impacts.
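The chemical volume figures above are easy to verify: 0.5% of a 3-million-gallon frack job is exactly the 15,000 gallons cited.

```python
# Checking the post's figures: chemicals are under 0.5% by volume of the
# fracking fluid, yet amount to roughly 15,000 gallons per job.
fluid_gallons = 3_000_000    # up to ~3 million gallons per frack job
chemical_fraction = 0.005    # less than 0.5% by volume

chemical_gallons = fluid_gallons * chemical_fraction
print(f"Chemicals per job: {chemical_gallons:,.0f} gallons")  # 15,000 gallons
```

The small percentage is misleading on its own; it is the sheer volume of fluid per well, multiplied across thousands of wells per year, that makes disposal a significant question.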

Thursday, November 24, 2011

Who Will Control Your Water


Fresh water supply poses a real and looming environmental risk. Regional shortages of water will drive decisions that will impact our future.

According to the US Census Bureau there are 312 million people in the United States. The water that exists on the planet is finite, but always moving as part of the water cycle or hydrologic cycle, on, above, and below the surface of the Earth. The good news about water is that “on average” the United States uses less than 8% of the water that falls as precipitation within our borders annually. Unfortunately, precipitation varies significantly from that average on a regional basis and over time, and our need for water is often greatest where there is the least precipitation because of the need for irrigation. In addition, only the cities on the Great Lakes have adequate precipitation and water storage to supply their populations’ water needs, so our urban centers have become very used to appropriating water from nearby regions.

As population rises, the demand for fresh water for drinking, domestic use, industry (especially power generation) and agriculture increases. The demand for food, and the water that is essential to produce food, grows with population and wealth. Globally, farming is estimated to account for 60%-70% of fresh water use. Irrigated agriculture consumes over 75% of the water in California, which produces 17.6% of U.S. crops and 7% of U.S. livestock and livestock products. California produces about half of U.S.-grown fruits, nuts, and vegetables, and several of these crops are currently produced only in California. In the United States we have used the complicated, layered and hidden subsidies within the various “farm bills,” along with subsidized water, to complicate the business of farming and obscure the true cost of food in America.

This past spring, even as the Mississippi River basin was inundated with water, large portions of the arid west were struggling with drought. Farmers in the west pumped groundwater (unsustainably) to produce their crops. Regional water supply and allocation of that water is a growing problem, especially in the western states, which are arid, dependent on irrigation and bound by multi-state water rights compacts. One of the best known of these compacts is the 1922 Colorado River Compact, negotiated by the seven basin states (Colorado, Nevada, Utah, New Mexico, Wyoming, Arizona and California), which divided the Colorado River basin into upper and lower portions and allotted consumptive use of the Colorado’s water on the basis of territory rather than prior appropriation. Before this agreement was negotiated, allocation of water rights (ownership) was based on historic use: the first to use the water owned it in perpetuity. In a land where water was wealth and all water was diverted from its natural location, this was how it was done. The allocation of water rights based on territory allowed development to proceed in the lower basin (essentially California) while safeguarding supplies for the upper basin. Then, as now, California's growth and demand for water was viewed with concern by her neighbors.

The problem is that the allocations promised more than 100% of the water available, and demand for water has exceeded the supply. Specifically, the amount of water allocated under the Colorado Compact was based on an expectation that the river's average flow was 16.4 million acre feet per year. Subsequent tree ring studies, however, have concluded that the long-term average flow of the Colorado is significantly less. According to the University of Arizona, a better estimate at the time of the Colorado Compact would have been 13.2 million acre feet, and records going back to paleolithic times (more than 10,000 years ago) indicate periods of mega-droughts in the distant past. During the drought of 2001-2006 the Colorado River flow was estimated at 11 million acre feet and hit a low of 6 million acre feet in 2002. The situation was critical, bordering on regional rationing, when the drought ended. More than 23 million people in the lower basin are at least partially dependent upon the water resources of the Colorado River; almost 74% of them reside in the greater Los Angeles and San Diego areas. The deep snowpack and rain of last winter in northern California has taken emergency rationing off the table, until the next drought.
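The flow figures above can be put side by side to show just how overcommitted the river is:

```python
# Comparing the flow estimates cited above (millions of acre-feet per year).
assumed_maf = 16.4     # average flow assumed when the 1922 Compact was negotiated
tree_ring_maf = 13.2   # long-term average suggested by tree ring studies
drought_maf = 11.0     # estimated average flow during the 2001-2006 drought

shortfall = (assumed_maf - tree_ring_maf) / assumed_maf
print(f"Allocations exceed the long-term flow by about {shortfall:.0%}")
print(f"Drought-era flow was only {drought_maf / assumed_maf:.0%} of the assumed average")
```

In other words, the Compact promised roughly a fifth more water than the river delivers over the long term, and during the recent drought the river carried only about two-thirds of the flow the allocations assume.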

Population growth, increased food production and increased power production all consume more and more water. The water available from the Colorado River has not increased with the increased demand and may even be falling. Even without climate change, paleoclimate records show a history of tremendous droughts in the region, and now more than 35 million people (in the upper and lower basins) depend upon the Colorado River’s waters for their water supply. The need for water is always growing. California is the most populous state in the nation, and Nevada was identified in the 2010 census as the fastest-growing state in the country, growing over 35% since 2000. Despite aggressive conservation activities, the region simply does not have enough water to meet the projected demand. Las Vegas was in the midst of a building boom when the drought hit. While adding 400,000 people, the city was able to reduce water use by a third through draconian conservation measures. This was city and suburban consumption, not agricultural or power generation use of water, which is much more difficult to cut.

The states of the Colorado Compact need more water. Overuse is killing the Colorado water basin, which suffers from decimated aquatic ecosystems, overdrawn and irreparably damaged groundwater aquifers, and polluted agricultural and urban runoff. California has focused all its attention on developing a plan for reducing carbon dioxide emissions, which is unlikely to prevent climate change, but has failed to develop a workable water budget (or a balanced state budget, for that matter). For two decades the Pacific Institute has called for a revamp of river management to protect endangered fish species and critical ecosystem elements, free up water for restoration of the Colorado River delta, and eliminate long-term groundwater overdraft throughout the basin. California and the other Colorado Compact states could not face the simple fact of a limited water supply and ignored the warnings, preferring to think about that tomorrow.

Even the conservation measures implemented in Las Vegas and throughout the region are not enough to ensure the long-term water supply. The Southern Nevada Water Authority has requested to build a pipeline to transfer 65 billion gallons of water from northern Nevada to Las Vegas. The state will decide in January whether to proceed with that plan. The project has encountered stiff opposition from conservationists and rural communities against tapping northern groundwater to fuel more growth in southern Nevada. The pressure to push the project forward is off after the large snowpack of last winter inundated the area in the spring thaw and filled Lake Mead for the first time in a decade. Lake Mead sits on the Nevada-Arizona border and was formed in 1935 after the construction of Hoover Dam. Lake Mead and the upstream Lake Powell are the major water storage facilities in the Colorado Compact system. Roughly 96% of Lake Mead's water comes from melted snow in the upper Colorado River basin states: Colorado, Utah, New Mexico and Wyoming.

Las Vegas is only one small area of the Colorado Compact. Regional politics demands maintaining a vibrant agricultural sector, quenching the thirst of growing urban and suburban populations, and feeding growing economies that also demand water for power and industry, despite the limitations of the water supply. Politicians do not seem able to make the hard choices that will balance their water budgets. Instead the politicians came up with the idea to investigate the “Long-Term Augmentation of the Water Supply of the Colorado River System.” The study commissioned by the Colorado Compact states and the federal government identified 12 long-term augmentation options: desalination of both brackish water and ocean water, coalbed methane produced water, recharging groundwater from other surface sources, reduction of consumptive use of water for power generation, reservoir evaporation reduction, storm water storage, vegetation management, importing water via boat, water reuse, weather modification, and importation of water from the Midwest. Former Governor of New Mexico Bill Richardson suggested “compacts” with the Great Lakes states to import water to the drier western states under a federal water czar. One of the ideas explored by the Southern Nevada Water Authority is to pipe 1,000 cubic feet of water per second from the Mississippi River 1,000 miles west to the Colorado River. They estimated that this aqueduct-pipeline would cost $11.4 billion to construct and an unknown amount of money to operate and maintain. Pat Mulroy, general manager of the Southern Nevada Water Authority, who is responsible for ensuring that the 2 million residents of Las Vegas have water, argues that this plan could flood proof the Mississippi River Basin while recharging the depleted Ogallala Aquifer under the Great Plains and maintain and increase agriculture on the eastern side of the Colorado River.
The plan is to remake nature with a modern era of big infrastructure projects rather than accept the limits of nature and locate large water use projects where water is plentiful. Water control and allocation would become another federal power under this water augmentation plan.

Monday, November 21, 2011

Solyndra was a Loan Not a Venture Capital Investment

Thursday, Energy Secretary Steven Chu sat through more than five hours of questioning by the oversight panel of the House Energy and Commerce Committee about the failure of Solyndra. He deftly danced around charges of incompetence, discussing Solyndra using phrases such as “cash burn rate,” “start up” and “build up sales,” and said the White House has not lost faith in him. The committee and Secretary Chu seemed to have missed the point. This was a loan guarantee program, not a venture capital fund. It was not supposed to be a government investment in Solyndra or any other company (Beacon Power, for example), but a loan guarantee program to aid viable projects in obtaining loans to build commercial scale projects.

The federal stimulus bill signed by President Obama expanded Title XVII of the Energy Policy Act of 2005 by adding Section 1705. DOE describes Title XVII Section 1705 as “Provides loan guarantees to commercial-scale renewable energy projects, that begin construction prior to September 30, 2011 in Biomass, Hydrogen, Solar, Wind/Hydropower, Geothermal, Transmission, or any other renewable energy systems.” This was clearly a loan guarantee.

All loans typically have a primary and a secondary source of repayment. The primary source of repayment is demonstrated or reliably projected cash flow: cash generated from the business or project. The secondary source of repayment is “conversion of the collateral,” that is, selling the assets of the company. Loan guarantees are necessary when either the primary or secondary source of repayment is impaired. Loans are made with borrowed funds: banks and other lenders borrow money in the financial markets and lend it to businesses at between 0.5% and 2.5% above their cost of funds.

The less risky the loan, the smaller the lender’s spread. A government guarantee would essentially make a loan almost riskless and provide the secondary source of repayment, the U.S. taxpayer. A loan guarantee program provides a guarantee to reduce the interest rate charged and thus the borrowing costs. In order to protect the U.S. taxpayer from excessive losses in the DOE Title XVII Section 1705 loan program, it was essential to make sure the projects had a primary source of repayment, a sound source of cash flow.
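The mechanics of the spread are worth making concrete. The sketch below is illustrative only: the two spread figures are hypothetical, chosen from the ends of the 0.5%-2.5% range cited above, and applied to the size of the Solyndra guarantee.

```python
# Illustrative only: how a guarantee lowers borrowing cost.
# The spread figures are hypothetical endpoints of the cited range.
principal = 535_000_000      # size of the Solyndra loan guarantee
risky_spread = 0.025         # 2.5% over cost of funds, unguaranteed
guaranteed_spread = 0.005    # 0.5% over cost of funds with a guarantee

annual_savings = principal * (risky_spread - guaranteed_spread)
print(f"Annual interest savings: ${annual_savings:,.0f}")  # $10,700,000
```

The guarantee, in other words, transfers roughly $10 million a year of risk pricing from the borrower to the taxpayer, which is exactly why verifying a primary source of repayment mattered.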

The DOE program provided the loan guarantees for free. However, Secretary Chu, the entire administration, the House Energy and Commerce Committee and the press seem to have forgotten that DOE Title XVII Section 1705 was a loan guarantee program, not a venture capital fund. The Solyndra loan appears to have had no primary source of repayment, was subject to regulatory and incentive risk, and had a limited secondary source of repayment. This was not a sound loan, yet $535 million of taxpayer money was put at risk.

Venture capital is equity provided to early-stage, high-potential, high risk, start-up companies. The target return on Venture capital funds is typically 20%-35% and the venture capital investor is buying portions of companies. Venture capital is used to grow and develop companies with limited operating history that have not yet reached the point where they are able to obtain a bank or other type of loan by demonstrating the ability to make a profit. In exchange for the high risks that venture capitalists assume by investing in riskier companies, venture capitalists usually get significant control over company decisions, and a significant portion of the company's ownership (and consequently value). A venture capital fund makes money by selling the equity in the successful companies it invests in.

The Title XVII Section 1705 loan guarantee for $535 million given to Solyndra was not a venture capital investment by the DOE. The DOE took no ownership of Solyndra; it simply guaranteed the company’s debt. Solyndra had no cash flow from its existing facility and was not profitable. Building a bigger and highly automated manufacturing facility was a wildly speculative attempt to build a market for a more expensive product. Title XVII Section 1705 was clearly a loan guarantee program being misused, not a venture capital fund.

Thursday, November 17, 2011

Keystone XL and Canadian Oil Sands

The Canadian oil sands have been known for decades. Until the recent protests against the Keystone XL pipeline labeled these oil reserves “Canadian Oil Sands,” they had been variously known as unconventional oil or crude bitumen; the Canadians use “oilsands” as a single word. These oil sands had been surface mined in Canada with drag lines and power shovels since the late 1960’s, but until oil prices rose and technology improved, these oil deposits were too expensive to exploit beyond the limited scope of surface mining. Advances in both oil sand extraction and refining techniques, along with rising oil prices, altered the economics and have made the extraction of oil sands possible.

The crude bitumen contained in the Canadian oil sands is a semi-solid or solid in natural deposits. It is a thick, sticky form of crude oil, so heavy and viscous that it will not flow unless heated or diluted with lighter hydrocarbons. Decades ago Canadian oil companies discovered that if they removed the sand filters from the well pumps and pumped as much sand as possible with the oil, production rates improved remarkably. This technique became known as Cold Heavy Oil Production with Sand (CHOPS). Pumping out sand opened “wormholes” in the sand formation, which allowed more oil to reach the well, improving production rates and recovery from around 6% to 10%. However, it produced large quantities of sand with oil residue that need to be disposed of; the method used recently has been disposal in underground salt caverns.

More advances in drilling techniques and the use of steam injection have allowed the Canadians to expand their recoverable oil. In Cyclic Steam Stimulation (CSS), steam at extremely high temperature is injected into a well over a period of weeks to months; then the well is allowed to rest while the heat soaks into the formation. Finally, the hot oil is pumped out of the well for weeks or months until the production rate falls off. Once the production rate falls off, the well is put through another cycle of steam injection, rest and production. CSS has a recovery rate of around 20% to 25%; the disadvantage is that the cost to inject steam is high.

Steam Assisted Gravity Drainage (SAGD) was developed after improvements in directional drilling technology made it possible. In SAGD, two horizontal wells are drilled in the oil sands, one at the bottom of the formation and another about 15-20 feet above it. Groups of wells are typically drilled off a central pad and, like fracking wells, can extend for miles in all directions. This reduces surface disturbances of the land and the footprint of the area to be reclaimed under the environmental license (the Canadian version of a permit). In each well pair, steam is injected into the upper well, melting the bitumen, which flows into the lower well and is pumped to the surface. SAGD was the breakthrough that quadrupled recoverable oil reserves and moved Canada into second place in proved world oil reserves. SAGD is cheaper than CSS, allows very high oil production rates, and recovers up to 60% of the oil in place. Refinements in the technology using in-situ hydrocarbon dilution are under development and could further reduce the cost and energy used in extraction. It is the SAGD method, however, that has created the need or desire for a pipeline to deliver the oil to the American markets.
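The recovery rates cited above explain why SAGD changed the reserve picture so dramatically. Applied to a hypothetical deposit (the deposit size below is an arbitrary illustration, not a figure from this post), the methods compare as follows:

```python
# Recovery rates cited above, applied to a hypothetical 1-million-barrel
# deposit for comparison. The deposit size is illustrative only.
oil_in_place_bbl = 1_000_000
recovery_rates = {"CHOPS": 0.10, "CSS": 0.25, "SAGD": 0.60}

for method, rate in recovery_rates.items():
    recoverable = oil_in_place_bbl * rate
    print(f"{method}: {recoverable:,.0f} barrels recoverable")
```

Going from CHOPS to SAGD multiplies the recoverable fraction of the same deposit sixfold, which is how a known resource could suddenly vault Canada up the world reserve rankings.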

Like all petroleum production, oil sands operations can adversely impact the environment. In the past, open pit mining of oil sands has impacted the land when trees, brush and overburden were removed for the mining site. As a condition of licensing, projects are required to implement a reclamation plan, but reclamation is a slow process. The mining industry asserts that the boreal forest will eventually recolonize the reclaimed lands. In addition, large amounts of water are used in oil sands operations for the steam in the current SAGD method. Despite recycling, most of the water ends up in tailings ponds. The Alberta provincial government limits how much water oil sands companies can remove from the Athabasca River to avoid impacts, and newer treatment methods have reduced the treatment and recovery time for tailings ponds. Still, environmental regulations need to evolve with technology. Last winter the Canadian press reported that Wikileaks had released a 2009 cable written by the U.S. Ambassador to Canada revealing that the Obama administration had inquired about a possible moratorium on new oil sands development. Former environment minister Jim Prentice responded (in 2009) to the U.S. Ambassador that he was prepared to protect Canada's green reputation: if industry did not take voluntary measures and the provincial government did not set more stringent regulations, he would step in and press federal environmental legislation.

Recently, the current Canadian Environment Minister, Peter Kent, announced that Ottawa will introduce environmental regulations to address the oil sands and reduce greenhouse gas emissions without implementing a cap-and-trade program. Canada has committed to reducing greenhouse gas emissions by 17% below 2005 levels by 2020, the same target the United States has committed to. Environmentalists contend that emissions trends suggest the expansion of the oil sands will prevent Canada from hitting its targets unless tougher environmental rules are put in place, and they strongly oppose further development of the oil sands until a stronger regulatory framework exists. These groups are fighting to stop the Keystone pipelines to the United States and western Canadian ports as a method of stopping the expansion of oil sands production. The Pembina Institute in Alberta states: “Filling the proposed KXL pipeline with oil sands will result in nearly a 50% increase in oil sands production. Until environmental management of the oil sands is improved, KXL will cause significant environmental harm due to increased oil sands production.”

In June 2010 the first phase of the Keystone Pipeline System went into operation, moving crude oil from Canada to market hubs in the U.S. Midwest. Keystone Cushing (Phase II) extended the pipeline and went into service in February 2011, connecting the storage and distribution facilities at Cushing to the Midwestern hubs. The proposed Keystone XL is an approximately 1,660-mile, 36-inch crude oil pipeline that would begin in Alberta and extend southeast through Saskatchewan, Montana, South Dakota and Nebraska, continuing through Oklahoma to an existing terminal not far from Port Arthur, Texas. The oil would arrive at the Texas refineries and ports for the American market and for export. The U.S. State Department is the lead agency handling the issue because the pipeline crosses national boundaries, but President Obama has made it clear he will make the final decision on whether to approve the pipeline.

Recently, Canadian Prime Minister Harper told reporters the project would create a vast number of jobs in Canada and the United States, and that he fully supported the project. President Obama has said environmental issues would weigh just as heavily in any decision as job creation and energy security. The pipeline was originally planned to run through the Ogallala aquifer in Nebraska, a very important water source for Midwest agriculture. On Monday, in response to U.S. State Department indications that the pipeline needed to avoid the Ogallala aquifer and the Sand Hills area, TransCanada (the pipeline owner) announced it had reached a tentative deal with Nebraska officials to move the proposed route of its Keystone XL pipeline away from the Ogallala aquifer. After the announcement the U.S. State Department made it clear that another environmental assessment would be necessary and would take 12 to 18 months, pushing the decision to 2013. A decision should never be made too soon or too late.