Thursday, January 22, 2015

Test Your Home for Radon

The U.S. Environmental Protection Agency (EPA) has named January national Radon Action Month, in hopes of getting as many people as possible to test their homes for radon. Radon is a naturally occurring radioactive gas produced by the breakdown of uranium, thorium, radium, and other radioactive elements that occur in granites and in some metamorphic and sedimentary rocks, and it is widespread in soil, rock, and water across the United States. Radon is odorless, colorless, and can cause cancer. Most people only test their home at purchase, but it is a good idea to retest your home if you make any changes to the structure, and every few years to be sure radon levels remain low. In addition, if your home has a radon mitigation system, it is important to monitor the system and retest at least every two years to make sure the system is functioning.

According to the EPA, about 21,000 people die each year from lung cancer caused by long-term exposure to elevated levels of radon in their homes. Radon is the second leading cause of lung cancer in the general population, and the leading cause of lung cancer in non-smokers. As radon gas is released from bedrock, it migrates upward through the soil and can seep into the basements of houses and other buildings through dirt floors, cracks in concrete, and floor drains. Radon tends to accumulate in enclosed spaces such as buildings. Air pressure inside your home is usually lower than the pressure in the soil around your home's foundation. Because of this difference in pressure, your home acts like a vacuum, drawing radon in through foundation cracks and other openings. Radon in its natural state cannot be detected with human senses: you cannot see, taste, or smell it. The only way to detect radon is to test.

Ambient levels of about 0.4 picocuries per liter (pCi/L) of radon are typically found in the outside air. EPA recommends mitigating radon if the results of one long-term test or the average of two short-term tests show radon levels of 4 pCi/L or higher. According to the EPA, radon levels in most homes can be reduced to 2 pCi/L or below using standard mitigation techniques. Short-term radon testing kits consist of a container of granular activated charcoal; the charcoal adsorbs the radon gas entering the canister from the surrounding air. At the end of the test period, typically 3-7 days, the canister is sealed and sent to the laboratory in the pre-paid mailer for analysis. There are also 90-day test kits. I have been thinking of trying one of those this winter.
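The EPA decision rule described above is simple enough to sketch in a few lines of code. The thresholds come from the EPA guidance cited in the text; the function name and structure are my own illustration, not any official tool:

```python
# Sketch of the EPA decision rule: mitigate if one long-term test, or the
# average of two (or more) short-term tests, reads 4 pCi/L or higher.
# About 0.4 pCi/L is typical of outdoor ambient air, for reference.

ACTION_LEVEL_PCI_L = 4.0   # EPA action level for mitigation
AMBIENT_PCI_L = 0.4        # typical outdoor level

def should_mitigate(short_term_results=None, long_term_result=None):
    """Return True if the EPA guidance suggests mitigating for radon."""
    if long_term_result is not None:
        return long_term_result >= ACTION_LEVEL_PCI_L
    if short_term_results:
        avg = sum(short_term_results) / len(short_term_results)
        return avg >= ACTION_LEVEL_PCI_L
    raise ValueError("No test results supplied")

# Two short-term tests averaging (3.2 + 5.0) / 2 = 4.1 pCi/L -> mitigate
print(should_mitigate(short_term_results=[3.2, 5.0]))  # True
```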

Radon mitigation takes one of two approaches: either preventing the radon from entering the home, or reducing radon levels by dilution after it has entered. There are several techniques that can be used, depending on the type of foundation the home has. It is better to prevent radon from entering the home in the first place, so I will discuss the preferred methods of prevention. The foundation type, construction materials, and condition will determine the kind of radon reduction system that will work best. Homes are built with some kind of foundation: a basement, slab-on-grade, a crawlspace, or a combination of the three. It is common to have a basement under part of the home and a slab-on-grade or crawlspace under the rest. In these situations a combination of radon reduction techniques may be needed to reduce radon levels below 4 pCi/L, the EPA's action level. Be aware, too, that there is a synergistic risk from active smoking and radon exposure that increases the risk of getting lung cancer.

Soil suction techniques are the preferred method of mitigation; they prevent radon from entering your home by drawing it from below the home and venting it through one or more pipes to the air above or outside the house, where it is diluted by the ambient air. An effective method to reduce radon levels in homes with crawlspaces is covering the dirt floor of the crawlspace with a high-density plastic sheet. A vent pipe and fan are then installed and used to draw the radon from under the sheet and vent it to the outdoors. This is called sub-membrane suction, and according to the EPA, when properly installed it is the most effective way to reduce radon levels in homes with crawlspaces.

In homes with concrete slab foundations or basements, sub-slab depressurization is the most reliable radon reduction method. One or more suction pipes are inserted through the floor slab into the crushed rock or soil underneath the home, and a fan is used to draw the radon from under the slab or basement floor to a roof or wall vent. It is possible, and in many cases preferable, to install the suction pipe under the slab by running the pipe on the outside of the house. Another variation is to use the drain tiles or perforated pipe installed in modern homes to keep basements dry; suction on these tiles or pipes can be effective in reducing radon levels. This system is most effective if the drain tiles are on the inside of the footer, sealed beneath the floor, and form a complete loop around the foundation of the building. In homes that have sump pumps, the sump can be capped so that it can continue to drain water and serve as the location for a radon suction pipe. There are kits that can be purchased for capping the sump. It is important that the sump cover lid be readily removable for service of the sump pump.

There are several other techniques, such as sealing cracks and the passive systems often installed in new construction, that are not as effective as active depressurization of the slab, basement, or crawlspace. As a temporary measure, ventilation will reduce radon levels by introducing more outside air, but it will increase your heating and cooling bills. After a mitigation system is installed, do confirmation testing of radon levels before you make the last payment to the contractor, to ensure that the mitigation system works. For more information on mitigation approaches and techniques see the EPA's Consumer's Guide to Radon Reduction.

Monday, January 19, 2015

Methane Regulation Coming Our Way

Last Wednesday, the U.S. Environmental Protection Agency (EPA) officially announced the next set of regulations for the United States to address climate change: EPA will set standards for methane and VOC emissions from new and modified oil and gas wells, and from natural gas processing and transmission plants. By the summer of 2015 EPA will issue a proposed rule, and a final rule will follow in 2016. As with the power sector, EPA plans to first regulate new sources of methane emissions, then circle back and regulate the existing sources.

Regulation of existing oil and gas wells will begin ahead of EPA regulations for the oil and gas industry. Using the Department of Interior’s Bureau of Land Management (BLM), the Administration will begin tightening the regulations on existing oil and gas wells by toughening the standards for operating gas and oil wells on federal land. The new standards will be designed to reduce venting, flaring, and leaks of natural gas, which is primarily methane, from these oil and gas wells. These standards, to be proposed this spring, will address both new and existing oil and gas wells on public lands and will serve as a test run on regulating existing oil and gas wells.

The Department of Transportation’s Pipeline and Hazardous Materials Safety Administration will propose natural gas pipeline safety standards in 2015 aimed at reducing leaks and releases from pipelines. The Department of Energy (DOE) will develop and demonstrate more cost-effective technologies to detect and reduce losses from natural gas transmission and distribution systems, which are believed to represent over 22% of methane gas losses in the sector. The DOE effort will include work to repair leaks and develop the next generation of compressors. According to the EPA, the President’s budget will propose $10 million to launch a program at DOE to examine the scope of leaks from gas distribution systems, pipelines, and compressor plants, and their contribution to global warming.

Two years ago Robert B. Jackson, Professor of Global Environmental Change at Duke University and Nathan Phillips, associate professor at Boston University Department of Earth and Environment collaborated with Robert Ackley of Gas Safety Inc., and Eric Crosson of Picarro Inc., to perform a study of gas leaks in Boston. They mapped the gas leaks under the city using a new, high-precision methane analyzer. The researchers discovered 3,356 leaks. The leaks were found to be associated with old cast-iron underground pipes, infrastructure that had not been maintained. The team went on to document leaks in Washington DC, but EPA wants to quantify the emissions of the entire wholesale and retail distribution system. In addition to the explosion hazard, methane, the primary ingredient of natural gas, is a powerful greenhouse gas that degrades air quality. Leaks in the United States are reported to contribute to $3 billion of lost and unaccounted for natural gas each year.

The White House set a new target for the U.S. to cut methane emissions in the energy sector by 40% to 45% by 2025, compared with 2012 levels. Methane emissions in the energy sector represent only about 30% of the total. Methane emissions come from diverse sources and sectors of the economy, unevenly dispersed across the nation and not well tracked. That is why the EPA wants to begin to better quantify the emissions. There is little hard data on methane emissions; nonetheless, the estimates below are the best available, and the Administration has used them to develop the current methane mitigation plan for the energy sector. You’ll note that methane emissions from natural gas systems have decreased by about 15% from 2005 to 2012, despite the production of natural gas increasing by about 50% during that time period.

Over the last two hundred and fifty years, the concentration of methane in our atmosphere has increased by 151% to 1.8 parts per million. Methane is the primary component of natural gas, and it is emitted to the atmosphere during the production, processing, storage, transmission, and distribution of natural gas; because gas is often found alongside petroleum, which is often much more valuable, methane is sometimes vented to the atmosphere rather than captured during oil production. Methane is also produced from the decomposition of human and animal waste as well as garbage, and is the major component of landfill gas. Methane is also released from enteric fermentation, the natural biological fermentation that takes place in the digestive systems of animals. In particular, ruminant animals with multi-chambered stomachs that eat grasses (cattle, buffalo, sheep, goats, and camels) produce and release methane by “passing gas” from the microbial fermentation that breaks down grass and hay into soluble products the animal can use. To significantly reduce the methane released from enteric fermentation it might be necessary to reduce the cattle and sheep population and the share of the American diet that is beef, lamb, and dairy products. Finally, when natural gas and other petroleum products are used as fuel, incomplete combustion releases traces of methane.

According to the Intergovernmental Panel on Climate Change (IPCC), methane is 20-25 times more effective than CO2 at trapping heat in the atmosphere. So even though it is a much smaller component of the atmosphere, controlling methane emissions is essential to the Administration's plans to address climate change, though unfortunately “addressing” will not stop climate change. Recall that it is the greenhouse effect that is expected to increase the sensitivity of the climate to carbon dioxide, methane, and the other greenhouse gases. According to climate scientists we have passed the tipping point, and there is no stopping the climate trajectory predicted by the models that have been developed to understand and predict the Earth's climate. However, detecting and reducing gas leaks is critical not only for reducing greenhouse gas emissions, but also for improving air quality and consumer safety, and for saving consumers money.
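The "20-25 times" figure is what makes methane leaks matter despite their small tonnage; it is how emissions get converted to CO2-equivalents. A quick sketch, using 25 as the multiplier from the IPCC figure cited above (the function name is my own):

```python
# CO2-equivalent conversion: a tonne of methane traps as much heat over
# a century as roughly 25 tonnes of CO2 (the IPCC figure cited above).
GWP_CH4 = 25  # 100-year global warming potential for methane

def co2_equivalent_tonnes(ch4_tonnes, gwp=GWP_CH4):
    """Convert a mass of methane to CO2-equivalent tonnes."""
    return ch4_tonnes * gwp

# 1,000 tonnes of leaked methane warms like 25,000 tonnes of CO2
print(co2_equivalent_tonnes(1000))  # 25000
```

This is why a modest-sounding leak rate from wells and pipelines can rival a much larger mass of CO2 emissions in warming terms.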

Thursday, January 15, 2015

Dominion Power Lines through the Rural Crescent

Fighting to protect the Rural Crescent and our way of life in Western Prince William County is a never-ending battle, and the latest threat is a new power line. Dominion is proposing a 230 kilovolt double-circuit transmission line that would start at an existing power facility southeast of the Interstate 66-Prince William County Parkway intersection in Gainesville, run west about 6 miles, then turn north, traveling west of Route 15 and cutting a path through the Rural Crescent for about 12 miles. The proposed transmission line would run on steel poles with an average height of 110 feet and require a 100- to 120-foot-wide right of way, according to information available from Dominion Power. A town hall meeting at Battlefield High School on Monday evening drew about a thousand people.

The community is up in arms about adding this level of infrastructure, in the form of above-ground transmission lines, through the bucolic Rural Crescent. This will not only damage the beauty and integrity of the Rural Crescent; we will also have to pay for the infrastructure in our monthly power bills. According to the Prince William County Planning website, these power lines are for a data center at 15505 John Marshall Highway in Haymarket called "Midwood Center," which is rumored to be built for Amazon. Data centers are the massive banks of computer servers that hold the internet together. Data centers create and operate the “cloud”: Amazon and others lease out their computing capacity. These facilities are primarily powered from the grid, but generators and batteries are always necessary to provide backup power if the grid goes down.

Though we do not generally think of it that way, a data center is an industrial use, not a commercial one, in its need for square footage and power, and it comes with a very large carbon footprint, diesel generators, and fuel storage tanks. The grassroots group Protect PWC should not only be concerned about the appearance of the power towers; they should also be concerned about what this will ultimately do to our power costs. We all get to pay for the infrastructure in our monthly electric bills. Data centers may be industrial in their energy and environmental footprint, but they employ very few people and pay only limited taxes.

Amazon already runs data centers in Northern Virginia; reportedly there are eight. In Manassas alone Amazon has two data centers run out of three buildings that look like large warehouses with green, corrugated sides. Air ducts big enough to accommodate industrial cooling systems run along the rooftops; large diesel generators sit in rows around the outside to ensure an uninterrupted power supply and internet connection. In 2010, Amazon was fined hundreds of thousands of dollars by the Virginia Department of Environmental Quality for installing and operating diesel generators without the required permits. Great, so they are not even good neighbors.

Data centers, by design, consume vast amounts of energy. According to the New York Times, data centers are typically run at maximum capacity around the clock, whatever the demand, so capacity is always available. As a result, data centers can waste the majority of the electricity they pull off the grid. The New York Times had the consulting firm McKinsey analyze energy use by data centers and found that, on average, they were using only 6% to 12% of the electricity powering their servers to perform computations. The rest was essentially used to keep servers idling and ready in case of a surge in activity that could slow or crash their systems.
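To get a feel for what those utilization figures mean in absolute terms, here is a rough sketch. The 10 MW facility size and 10% useful fraction are illustrative numbers of my own choosing, not from the McKinsey study:

```python
# Illustrative arithmetic (assumed numbers, not from the study): a 10 MW
# facility running around the clock, with only 10% of the server power
# doing useful computation, per the 6-12% range cited above.
HOURS_PER_YEAR = 24 * 365

def annual_kwh(megawatts):
    """Annual energy draw of a load running continuously, in kWh."""
    return megawatts * 1000 * HOURS_PER_YEAR

facility_kwh = annual_kwh(10)
useful_fraction = 0.10

print(f"total draw:   {facility_kwh:,.0f} kWh/yr")
print(f"idle overhead: {facility_kwh * (1 - useful_fraction):,.0f} kWh/yr")
```

At 10 MW around the clock, that is 87.6 million kWh a year, with roughly 90% of it spent keeping servers idling and ready.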

In 2013 data centers used 2.25% of the power generated in the United States, and data center power use is reportedly growing at 20% a year, expected to reach 153 billion kilowatt-hours in 2020. Energy costs (and security) are the biggest factors in locating energy-hungry data centers. Virginia, with its low-cost nuclear and coal-powered electricity, is an attractive location for a power-hungry use. However, with the arrival of the data centers combined with the new U.S. Environmental Protection Agency (EPA) regulations on electrical power plants, our cost of electricity will not be low for long.

In June 2014 EPA announced their Clean Power Plan: carbon emissions regulations for existing power plants. A state-specific compliance plan is due to the EPA for review and approval in June 2016. Virginia currently emits 1,438 lbs of CO2 per megawatt-hour of electricity generated, and we need to be at 991 lbs of CO2 per megawatt-hour in 2020 and 810 lbs of CO2 per megawatt-hour in 2030 to be in compliance with the EPA Clean Power Plan mandate. In the proposed regulations, 2012 is the baseline year chosen by the EPA to calculate the interim and final CO2 goals for each state. However, with Amazon building data centers, each using more power than thousands of homes, added since the base year, we will have to accommodate their demand in our plan.
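Those intensity targets translate into substantial percentage cuts. The arithmetic below uses only the figures quoted above (the function name is my own shorthand):

```python
# Required cuts in CO2 intensity for Virginia, from the figures above:
# current 1,438 lbs CO2/MWh; targets 991 (2020) and 810 (2030).
CURRENT_LBS_PER_MWH = 1438

def required_reduction_pct(target, baseline=CURRENT_LBS_PER_MWH):
    """Percent cut in CO2 per MWh needed to reach a target intensity."""
    return (baseline - target) / baseline * 100

print(f"2020 target (991 lbs/MWh): {required_reduction_pct(991):.1f}% cut")
print(f"2030 target (810 lbs/MWh): {required_reduction_pct(810):.1f}% cut")
```

That works out to roughly a 31% cut in carbon intensity by 2020 and about 44% by 2030, relative to where Virginia's generation mix stands now.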

The Virginia Department of Environmental Quality, the Virginia Department of Mines, Minerals and Energy, and the State Corporation Commission worked with a consulting team to determine whether Virginia could comply with the proposed EPA Clean Power Plan. They identified four scenarios that allowed Virginia to meet the 2020 goal of 991 lbs of CO2 per megawatt-hour. All four of the successful scenarios include major increases in natural gas fueled electrical generation, expansion of the existing natural gas pipeline network into the Commonwealth, reductions in coal-generated capacity, and significant increases in the amount of renewable energy in the power generation base.

Our existing coal-fired power plants will have to be replaced. The simple truth is that coal emits 2,268 lbs of CO2 per megawatt-hour while natural gas-fired turbines emit 903 lbs of CO2 per megawatt-hour. As the EPA-mandated cap on CO2 emissions forces us to build new low-CO2 generation capacity, Amazon and other data centers rev up the demand for electricity without creating more than a handful of jobs or significantly paying for the infrastructure necessary to provide that low-CO2 electricity: the power lines, the lower-CO2 generating capacity, the gas pipelines necessary to generate more power at lower CO2, the renewable no-CO2 generation, and so on. Virginia gets to pay for that in increased power costs.
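The emission factors above also show why coal has to shrink so drastically. For a deliberately simplified coal-and-gas-only mix (ignoring nuclear and renewables, which are of course part of the real plan), the largest coal share that still meets a target intensity can be solved directly:

```python
# Simplified two-fuel sketch using the emission factors above:
# coal 2,268 lbs CO2/MWh, gas 903 lbs CO2/MWh. Solve for the coal
# share x in: x*2268 + (1-x)*903 = target. This ignores nuclear and
# renewables, so it is an illustrative bound, not Virginia's plan.
COAL, GAS = 2268, 903

def max_coal_share(target_lbs_per_mwh):
    """Largest coal fraction of a coal/gas mix meeting the target."""
    return (target_lbs_per_mwh - GAS) / (COAL - GAS)

print(f"2020 target (991): at most {max_coal_share(991):.1%} coal")
# A negative result for 2030 means even 100% gas (903 lbs/MWh) exceeds
# the 810 lbs/MWh target -- zero-CO2 sources must enter the mix.
print(f"2030 target (810): {max_coal_share(810):.1%} (infeasible with coal and gas alone)")
```

Coal can be at most about 6% of a coal/gas mix by 2020, and by 2030 even an all-gas fleet would overshoot the target, which is why the compliance scenarios all lean on renewables as well.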

Monday, January 12, 2015

Keystone XL Pipeline: the Latest

On Friday, January 9, the Nebraska Supreme Court ruled 4-3 in favor of landowners opposing the Keystone XL pipeline, but under Nebraska law, five judges are needed to declare a statute passed by the legislature and signed by the governor unconstitutional; therefore, the court said, the measure passed in 2012 “must stand by default.” The law in question was Nebraska law LB 1161, which Governor Heineman used to sign the January 2013 recommendation to the U.S. Department of State for a Presidential Permit for the Keystone XL pipeline to cross the international border. The Nebraska Department of Environmental Quality (DEQ) had recommended approval of the revised route for the pipeline, which had been selected with their guidance.

Under LB 1161 a pipeline carrier submits a route for evaluation by the Nebraska DEQ and receives the Governor's approval, instead of obtaining approval from the Public Service Commission (PSC) under the requirements of an older law called the Major Oil Pipeline Siting Act (MOPSA), which actually went into effect after the Keystone XL Pipeline application was submitted to the United States Department of State. The newer law, LB 1161, amended MOPSA to exempt any major oil pipeline that had submitted an application to the United States Department of State under Executive Order 13337 prior to MOPSA's effective date. The only oil pipeline to fit within this exemption was the Keystone XL Pipeline.

The MOPSA process includes review by the Nebraska DEQ, Department of Natural Resources, Department of Revenue, Department of Roads, Game and Parks Commission, Nebraska Oil and Gas Conservation Commission, Nebraska State Historical Society, State Fire Marshal, and Board of Educational Lands and Funds, and also requires the PSC to schedule a public hearing within 60 days of receiving an application. LB 1161 allowed the Keystone XL project to shorten this process for its revised route.

A judge in a lower court had found that under the Nebraska State Constitution, exclusive regulatory control over pipelines like the Keystone XL must be exercised by the Nebraska Public Service Commission (PSC) and cannot be given to the Governor, and that LB 1161 must therefore be declared unconstitutional and void. The judge stated in her opinion, though, that “such a declaration should not be misconstrued as an indictment of the work done by NDEQ in conducting the comprehensive evaluation required by LB 1161, or the conclusions reached by the Governor after reviewing NDEQ's Final Evaluation Report and approving the Keystone XL Pipeline route.”

The three plaintiffs in the case are residents and taxpayers of the State of Nebraska. Each plaintiff owns land, or is the beneficiary of a trust holding land, that was or still is in the path of one or more proposed routes for the Keystone XL Pipeline. The defendants were the Governor, the Director of the Nebraska DEQ, and the Nebraska State Treasurer. The defendants argued that the plaintiffs did not have standing to challenge LB 1161, that the district court lacked subject matter jurisdiction, and that the case should have been dismissed. Three of the Nebraska Supreme Court judges concluded the landowners did not have legal standing to bring the lawsuit, and as a result they declined to address the larger constitutional question. Though four of the judges concluded the routing law “violates fundamental constitutional limits on government power in Nebraska,” a supermajority of five judges is required to declare a law unconstitutional, so the law stands.

On April 18, 2012, TransCanada submitted a new "Initial Report Identifying Alternative and Preferred Corridors for Nebraska Reroute" to the Nebraska Department of Environmental Quality (NDEQ) for evaluation (paid for by TransCanada) of a new route for the Keystone XL Pipeline project under the requirements of LB 1161. This application for an NDEQ review of a new route was followed on May 4, 2012 by a new application to the United States Department of State for a Presidential Permit to construct and operate the Keystone XL Pipeline.

On January 31, 2014, the U.S. Department of State released the Final Supplemental Environmental Impact Statement for the Keystone XL Pipeline. The executive summary states that Keystone XL is “unlikely to significantly impact the rate of extraction in the oil sands or the continued demand for heavy crude oil at refineries in the United States based on expected oil prices, oil-sands supply costs, transport costs and supply-demand scenarios.” In other words, no matter what action the Administration chooses to take on this portion of the pipeline (approve, reject, or stall), the oil sands are not staying in the ground in Canada.

The Department of State opened a 30-day comment period on February 5, 2014; then in April the administration announced that it would delay the decision until after the Nebraska Supreme Court decided the case. There has never been a timeline for making a decision, but the time has come to fish or cut bait. Secretary Kerry is empowered to make the final decision, though the White House has always controlled the process and will now be forced to make a decision because Congress is taking action. A bill that authorizes the Keystone XL pipeline to cross the international border has passed the U.S. House of Representatives 266-153 and is headed to the U.S. Senate, where it is expected to pass, but it does not have the majority necessary to override a Presidential veto. The bill should arrive on the President’s desk next week.


Though the President will veto the bill, it is just a pipeline; the crude oil is not staying in the ground, and it will come by pipeline, boat, truck, or railroad. As Marcia McNutt, the editor-in-chief of the AAAS journal Science, stated in a recent editorial, moving the Canadian crude by pipeline is the least environmentally damaging and safest method of transporting the oil. There is currently a pipeline, Keystone I, that runs east from Hardisty, Alberta to Manitoba and then south through the Dakotas to Steele City, Nebraska; it is a less direct route and a lower-volume pipeline. Keystone II runs from Steele City to the storage facilities at Cushing, Oklahoma. Keystone III, running from Cushing, Oklahoma to Nederland, Texas, began delivering crude oil from Cushing to the oil refineries in Texas on Wednesday, January 22, 2014. The Gulf Coast Project, Keystone III, did not require a Presidential Permit because it does not cross an international border.

Thursday, January 8, 2015

The Bay Report Card

The Chesapeake Bay Foundation's 2014 State of the Bay Report has been released. This report uses 13 indicators in three categories (pollution, habitat, and fisheries) to offer an assessment of the health of the Chesapeake Bay. According to the report, the overall health of the Chesapeake Bay is unchanged since 2012. The 2014 report notes improvements in dissolved oxygen, water clarity, oysters, and underwater grasses. Nitrogen, toxics, shad, resource lands, forested buffers, and wetlands were unchanged. Declines were seen in rockfish, blue crabs, and the phosphorus score. The 2014 phosphorus score dropped because annual phosphorus loads were higher in 2014 compared to 2012, particularly in the Potomac and James Rivers and on Maryland’s Eastern Shore, home of all those confined poultry feeding operations. According to a Maryland state study, each chicken generates approximately 0.41 lbs of nitrogen per year and around 0.35 lbs of phosphorus per year. (Human waste contains around 0.15 pounds of phosphorus per pound of nitrogen.)
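The Maryland figures make the phosphorus problem concrete: chicken waste is far more phosphorus-rich relative to its nitrogen than human waste. A quick comparison using only the numbers quoted above:

```python
# Phosphorus-to-nitrogen ratio of chicken waste vs human waste, using
# the per-chicken figures from the Maryland study cited above.
N_PER_CHICKEN_LBS = 0.41   # lbs nitrogen per chicken per year
P_PER_CHICKEN_LBS = 0.35   # lbs phosphorus per chicken per year
HUMAN_P_PER_LB_N = 0.15    # lbs phosphorus per lb nitrogen, human waste

chicken_p_per_n = P_PER_CHICKEN_LBS / N_PER_CHICKEN_LBS
print(f"chicken waste: {chicken_p_per_n:.2f} lb P per lb N")
print(f"vs human waste: {chicken_p_per_n / HUMAN_P_PER_LB_N:.1f}x richer in P")
```

Pound of nitrogen for pound of nitrogen, poultry waste carries roughly five to six times the phosphorus of human waste, which helps explain why the Eastern Shore drags down the phosphorus score.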

The 2014 State of the Bay Report scores the health of the Bay at 32 out of 100, a D+ according to the Chesapeake Bay Foundation. When I was in school anything below a 60 was an F; however, the Chesapeake Bay Foundation grades on a curve. The current goal of all the Environmental Protection Agency mandated Watershed Implementation Plans is a grade of 70, which would represent a saved Bay according to the Chesapeake Bay Foundation. If you recall, the EPA mandated a contamination limit called the TMDL (total maximum daily load) for nutrient contamination and sediment to restore the Chesapeake Bay and its tributaries. The TMDL sets total Chesapeake Bay watershed limits for nitrogen, phosphorus, and sediment that were then partitioned to the various states and river basins. Each of the states and Washington DC were required to submit, and have approved by the EPA, a detailed plan of how they intend to achieve the pollution reduction goals assigned to them. These plans are called the Watershed Implementation Plans (WIPs), but the Chesapeake Bay Foundation refers to them as the “Clean Water Blueprint.”

In case you are wondering what constitutes a 100, the Chesapeake Bay Foundation says that the unspoiled Bay ecosystem described by Captain John Smith in the 1600s, with extensive forests and wetlands, clear water, abundant fish and oysters, and lush growths of submerged vegetation would rate a 100 on their scale.

Reducing pollution from agriculture is the goal of the Watershed Implementation Plans and the key to cleaning up the Chesapeake Bay. An often-stated fact is that agriculture is the largest source of nitrogen, phosphorus, and sediment pollution in the Chesapeake Bay. What is usually not reported is that agriculture is the largest source of nutrient pollution because it is the largest active land use in the region, not because agriculture is more polluting than other land uses. According to the Chesapeake Bay model, agricultural land represents almost twice the land area of the developed areas. Bay-wide, agriculture is not on pace to meet the 2017 WIP benchmarks, though Virginia is on track thanks to a very successful stream-exclusion fencing program that has the state paying 100% of the cost of the fencing. However, urban and suburban stormwater runoff is heading in the wrong direction.

Farmers have made progress, especially in Virginia, but not enough. Reducing pollution from agriculture, converting acreage to stream buffers, and restoring wetlands are the cheapest ways to reduce pollution, and the states' WIPs expect to get 75% of their nitrogen, phosphorus, and sediment pollution reductions from agriculture. Even using the agricultural lands to achieve most of the pollution reductions, the costs of the Watershed Implementation Plans are astronomical: about $13.6-$15.7 billion in Virginia alone.
With budget shortfalls throughout the region, it is important to maintain and expand the cost-share funding for agriculture and the budgets for the Conservation Districts that implement and monitor these programs. If we fail to meet the EPA-mandated reductions in nitrogen, phosphorus, and sediment pollution, EPA has threatened to use litigation and enforcement actions against point sources (waste water treatment plants and stormwater permit holders) to achieve the reductions. Using stormwater retrofits to achieve the reductions would bankrupt the state.

Monday, January 5, 2015

In Praise of Nutrient Trading in Virginia

In mid-December Virginia Governor Terry McAuliffe, U.S. Environmental Protection Agency (EPA) Administrator Gina McCarthy, and the Secretary of Agriculture all gathered in Fairfax County, Virginia to applaud the expansion of the Virginia Nutrient Trading Program to meet the requirements of the EPA-approved and -mandated Watershed Implementation Plan. The nutrient trading program is an appealing, flexible, and cost-effective way to meet and maintain water quality goals. So, let’s back up and explain what is going on.
Volunteers planting trees to reduce erosion along a stream


The Chesapeake Bay and its tidal waters have been impaired by the release of excess nitrogen, phosphorus, and sediment. The EPA mandated a contamination limit called the TMDL (total maximum daily load) for nutrient contamination and sediment to restore the Chesapeake Bay and its tributaries. The TMDL sets total Chesapeake Bay watershed limits for nitrogen, phosphorus, and sediment that amount to about a 25% reduction from 2011 discharge levels for the six Chesapeake Bay watershed states and Washington DC. The pollution limits were then partitioned to the various states and river basins based on the Chesapeake Bay computer model and monitoring data. Each of the states and Washington DC were required to submit, and have approved by the EPA, a detailed plan of how they intend to achieve the pollution reduction goals assigned to them. These plans are called the Watershed Implementation Plans (WIPs). The Virginia WIP outlines a series of pollution control measures and strategies for how we are going to achieve and fund the pollution control necessary to meet the EPA mandate.

One of the key strategies was expansion of Virginia’s successful nutrient trading program. Legislation passed in 2005 created the Chesapeake Bay Watershed Nutrient Credit Exchange Program, which provides Virginia’s regulated pollution sources in the Bay watershed with the opportunity to meet required nutrient reductions through trading. The legislation also allows “point sources” like waste water treatment plants to purchase nutrient reductions from “nonpoint” sources like farms to offset new or increased nutrient discharges in excess of established load caps. Until recently the program had primarily been used by waste water treatment plants to offset the additional pollution loads from population growth. The Virginia nutrient trading program is modeled on the successful cap-and-trade program created to comply with the Clean Air Act’s Acid Rain Program limits for sulfur dioxide.
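The mechanics of a point-to-nonpoint trade can be sketched in a few lines. Trading programs typically apply a trading ratio so that a plant must buy more than a pound of nonpoint reduction per pound of overage; the 2:1 ratio here is a hypothetical illustration of that idea, not the actual terms of Virginia's program:

```python
# Sketch of a point-source credit purchase under a nutrient trading
# program like the one described above. The 2:1 trading ratio is a
# hypothetical illustration, not Virginia's actual program terms.

TRADING_RATIO = 2.0  # assumed: 2 lbs of nonpoint reduction per 1 lb offset

def credits_needed_lbs(excess_discharge_lbs, ratio=TRADING_RATIO):
    """Nonpoint-source reduction a plant must buy to offset an overage."""
    return excess_discharge_lbs * ratio

# A plant 500 lbs of nitrogen over its cap must buy 1,000 lbs of
# farm-generated reductions under the hypothetical 2:1 ratio.
print(credits_needed_lbs(500))  # 1000.0
```

The ratio builds in a margin for the uncertainty of nonpoint reductions, so the Bay comes out ahead even if a farm practice underperforms its estimate.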

Virginia has found other ways to use nutrient trading to reduce compliance costs for large point and non-point generators of nutrient contamination. The example cited by EPA Administrator McCarthy was the Virginia Department of Transportation (VDOT), which used banked pollution credits, generated by farmers implementing Best Management Practices and riparian buffer stream bank plantings, to offset stormwater pollution during road construction. Under the increased Federal and State stormwater regulations, VDOT would otherwise have been required to build stormwater retention ponds and sediment filters for each construction section. VDOT did install permanent stormwater management infrastructure (using both traditional stormwater management and low impact strategies) for the completed road, but using traded credits allowed it to avoid building wasteful temporary structures for the construction process while still reducing stormwater pollution in streams during construction.

Waste water treatment plants have predominantly traded among themselves. Newly expanded waste water treatment plants trade the excess credits that result for years after an expansion, until the community “grows” into the plant, while plants that have outgrown their facilities or need to meet more stringent standards buy the credits. Some waste water treatment plants have also created multi-year contracts with farmers to install nutrient reduction Best Management Practices during the periods before expansion and improvement projects driven by tighter regulation or growth in the population served. There are critics of the program who oppose pollution trading because it allows polluters to buy their way out of controlling their pollution or restoring their degradation. The critics see only that entities are paying to pollute. However, with a growing population only a trading program can provide a framework to offset the inevitable additional pollution loads that come with more people.

Some critics are concerned about the potential for fraud or abuse. However, as the examples above show, this strategy allows a short-lived environmental impact to be offset without a huge and ultimately wasteful capital expenditure. Resources, including capital, are limited, and a trading framework allows for cost effective temporary or longer term solutions. In the Virginia examples above, the permitted entity is required to verify and report the offset credits. Since most permitted facilities are VDOT, public waste water treatment plants and municipalities with stormwater permits, one hopes their veracity can be depended on to a greater extent, and they have the ability to partner with Conservation Districts, which have the expertise to evaluate the Best Management Practices in place.

Virginia’s Chesapeake Bay Watershed Nutrient Credit Exchange Program requires the level of Best Management Practice implementation called for in the nutrient tributary strategies to achieve nutrient reductions. A farm must achieve this level of nutrient reduction (known as the baseline) before it is allowed to generate and sell offsets to potential trading partners. Once the baseline level of nutrient reductions is achieved, additional reductions using approved Best Management Practices or land use conversions are eligible to generate offsets for trading. Cost share dollars can be used to implement the BMPs that achieve the baseline, which must be completed on the entire USDA Farm Services Agency tract before generating tradable credits. The program uses the incentive of earning additional dollars and cost share to push all farmers to implement Best Management Practices on all their lands.
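The baseline-first rule can be sketched in a few lines of code. This is a hypothetical illustration of the logic described above, not the program's actual accounting; the function name and the pound figures are made up for the example.

```python
# Hypothetical sketch of the trading rule: a tract must first meet its
# baseline nutrient reduction before any further reduction becomes a
# tradable credit. Numbers are illustrative, not from the actual program.

def tradable_credits(total_reduction_lbs: float, baseline_lbs: float) -> float:
    """Only reductions beyond the baseline generate offsets for sale."""
    if total_reduction_lbs < baseline_lbs:
        return 0.0  # baseline not yet met on the whole tract; nothing to trade
    return total_reduction_lbs - baseline_lbs

# A farm achieving 1,200 lbs/yr of nitrogen reduction against a
# 1,000 lbs/yr baseline could offer 200 lbs/yr in credits; a farm
# below its baseline has nothing to sell.
print(tradable_credits(1200, 1000))  # 200.0
print(tradable_credits(800, 1000))   # 0.0
```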

For a trading program to succeed there needs to be a regular and predictable demand for credits and a fairly straightforward, simple way to obtain the needed credits. Realistically, the program will be limited to meeting the compliance needs of county and township stormwater permits, VDOT construction projects and, if counties facilitate and mandate its use, construction projects large and small. The need for credits could be reasonably projected by county staff.

To allow for future population growth there could be a permanent demand for offsets from newer communities. The annual payments, maintenance of Best Management Practices or land use conversions, and verifications could be funded by homeowner association fees. Every acre of development requires many more acres of supporting infrastructure development: schools, roads, shopping centers, churches, and public buildings. All this development increases runoff and adds nitrogen, phosphorus and sediment loads from sewer and septic systems and stormwater runoff from pavement and yards. Virginia has plans to “seed” the program by placing some eligible credits in the Water Quality Improvement Fund for each watershed. For this to work, the Conservation Districts in each watershed must have adequate funding, training and incentives. The dollars spent on these programs are the cheapest way to comply with the EPA mandate and clean up our rivers and streams.


Full disclosure: In another part of my volunteer work I am a Director of the Prince William Soil and Water Conservation District. You should check out all that the Conservation District does at their web site.

Thursday, January 1, 2015

Depleting the Ogallala


The High Plains aquifer, commonly known as the Ogallala aquifer (because the Ogallala formation makes up about three quarters of the aquifer), burst into public awareness due to the protests associated with the Keystone XL Pipeline. Concern for the porous soils of the Sandhills and fears of a possible oil leak into the Ogallala aquifer were among the reasons the route through Nebraska was changed. The High Plains aquifer underlies about 175,000 square miles in parts of Colorado, Kansas, Nebraska, New Mexico, Oklahoma, South Dakota, Texas, and Wyoming, which together make up one of the primary agricultural regions in the United States. There is a much bigger threat to the Ogallala: the aquifer is being depleted because the groundwater within it is predominantly non-renewable. This groundwater aquifer is the primary source of water for the High Plains. This was open range land until groundwater from the aquifer was used to turn the range land into irrigated cropland. However, according to John Opie in “Ogallala: Water for a Dry Land,” this is essentially fossil water that was generated 10,000-25,000 years ago by the melting of the glaciers of the Rockies.

The High Plains aquifer is the most intensively used aquifer in the United States and 97% of the water is used for irrigation. Groundwater withdrawals from the High Plains aquifer represent about 20% of all groundwater withdrawals within the United States and have turned the dry range land in the center of the country into the breadbasket of the world. There are only about 2.5 million people living on the 175,000 square miles of land covering the High Plains aquifer.

The High Plains aquifer is being depleted by irrigation. The grains we grow for consumption, ethanol and export are slowly depleting our water reserves. Farmers and ranchers began extensive use of groundwater for irrigation in the 1930s and 1940s. Estimated irrigated acreage was 2.1 million acres in 1949, 13.7 million acres in 1980, and 15.5 million acres in 2005 (U.S. Department of Agriculture).

About every 5 years, groundwater withdrawals for irrigation and other uses are compiled from water-use data and reported by the U.S. Geological Survey (USGS) and State agencies. Groundwater withdrawals from the High Plains aquifer for irrigation increased from 4 to 19 million acre-feet from 1949 to 1974, fell slightly, and then reached 21 million acre-feet in 2000 and an estimated 19 million acre-feet in 2005 (USGS).

According to a recent report from the U.S. Geological Survey (USGS), total water stored within the aquifer was about 2.92 billion acre-feet in 2013, a decline of about 8% (or 266.7 million acre-feet) since irrigation began. Water stored within the aquifer fell 36 million acre-feet from 2011 to 2013. The water depletion was uneven: the decline in Texas alone was 13.2 million acre-feet, while over the same period there was no change in water storage in South Dakota and Wyoming. According to Virginia McGuire, the lead author of the report and a water scientist with the USGS, “The measurements made from 2011 to 2013 represent a large decline. This amount of aquifer depletion over a 2-year period is substantial and likely related to increased groundwater pumping.” This level of groundwater use is also unsustainable. We need to rethink our agricultural policies and environmental regulations in terms of sustainable water.
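The cited figures can be cross-checked with simple arithmetic. The snippet below works backward from the 2013 storage and total decline reported by the USGS to the implied predevelopment storage, and confirms the roughly 8% total decline; the predevelopment figure is inferred here, not quoted from the report.

```python
# Rough consistency check of the USGS figures cited above
# (all values in million acre-feet).
storage_2013 = 2920.0   # ~2.92 billion acre-feet remaining in 2013
decline_total = 266.7   # total decline since irrigation began

# Assumed: predevelopment storage = 2013 storage + total decline.
storage_predev = storage_2013 + decline_total

pct_decline = 100 * decline_total / storage_predev
print(f"Total decline since predevelopment: {pct_decline:.1f}%")

decline_2011_13 = 36.0  # decline from 2011 to 2013 alone
pct_recent = 100 * decline_2011_13 / storage_predev
print(f"2011-13 decline: {pct_recent:.2f}% of predevelopment storage")
```

The result, about 8.4% overall with more than 1% of the original storage lost in just two years, matches the report's characterization of the 2011-13 decline as substantial.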