Thursday, August 30, 2012

Hantavirus: Clean Up and Mouse-Proof the House

Image of Deer Mouse taken from CDC 

I read in the paper the other day that seven Yosemite visitors have recently been stricken with Hantavirus. Three died. Hantaviruses are found in the droppings, urine and saliva of infected deer mice. Hantavirus pulmonary syndrome (HPS), the illness caused by the virus, can take 3 to 60 days to develop after exposure. Symptoms include fever, headache, muscle aches, vomiting and diarrhea. The syndrome is fatal in 30%-40% of all cases. There is no vaccine, treatment or cure for HPS. Hantavirus is typically transmitted by breathing in airborne particles from the droppings, urine and saliva of infected rodents, although a small number of reported HPS cases are believed to have been contracted through rodent bites. Since the virus was first identified in the United States in 1993, there have been 67 cases in California and 593 nationwide, including cases along the Appalachian Trail, possibly contracted in Virginia. Rodents themselves neither get sick nor pass the infection along to other animals; however, the Centers for Disease Control and Prevention (CDC) has identified the ability of Hantavirus to adapt to new rodent species. Although currently rare, HPS is potentially deadly and may be an emerging disease. Rodent control in and around the home (and Curry Village) remains the primary strategy for preventing Hantavirus infection.

Knowing about Hantavirus, I was distressed and disgusted when, returning from an extended trip in the first year we owned our home, I discovered mouse droppings in the pantry and utility room. My husband took care of capture and removal, I took care of the safe cleanup and mouse-proofing of the house, and the cat provided monitoring, patrolling the house at night. My only comment on mouse capture is that peanut butter and walnuts are excellent bait for either capture-and-release traps or spring traps. Mice love nuts, and the ruined baking supplies provided excellent bait for my husband’s trapping. It took a while, but eventually he was able to rid the house of mice, though at one point it seemed that every mouse he dumped in the woods would sneak right back in through the gas pipe to the fireplace. Eventually, I managed to seal up all the likely entries and we have been mouse free for a while, but annual maintenance is necessary to keep mice out of the house.

A mouse can fit through the narrowest gap, seemingly flattening itself to crawl into the house. According to the CDC, a gap of a quarter of an inch, or a hole the size of a pencil eraser, is large enough for a mouse to enter. A systematic approach is best for sealing all entry points. First of all, there is no way to prevent mice from getting into the garage, because garage doors simply do not seal that tightly in their tracks. Instead, it is necessary to keep all nesting material and clutter out of the garage and to seal all entries from the garage into the house. If you keep your trash cans in the garage, make sure each can has a tight lid and no holes. The garage turned out to be an entry point into our house. Because of a sloping lot that gives me a daylight basement, the top of the foundation is about 12 inches above the garage floor, and a compressed layer of insulation had allowed the mice into the basement. Steel wool and lath screening were pushed into every crack, the area was caulked and, thanks to Larry Reed, carpenter extraordinaire, the garage was finished, trimmed and sealed. New weather stripping was placed on every exterior door. Lath screen was cut to fit around all the kitchen pipes, the dryer vent pipe, and the gas pipe, pilot light and valve to the fireplace. The space between the foundation and siding was carefully caulked and sealed. Attic vents were screened. Windows were caulked and the weather stripping on the windows checked. All exterior holes for electrical, plumbing, and gas lines were carefully sealed with Duxseal.

Do not sweep or vacuum up mouse urine, droppings, or nests. Doing so sends virus particles into the air, where they can be breathed in. To clean up the mouse droppings in my pantry and utility room, I first geared up. According to OSHA and the CDC, if there is not a heavy accumulation of droppings you need only wear disposable protective clothing and gloves (neoprene, nitrile or latex), rubber boots and a disposable N95 respirator to safely clean up rodent droppings. An N95 disposable respirator is just one of those white dust masks (look for the certification number, and yes, I keep them in the house). I wore my rain boots and some old work clothes that I threw out afterwards. The first thing I did was throw out all the food in the pantry and spray the shelf paper with disinfectant. Make sure you get the urine and droppings very wet. Let it soak for 5 minutes, then use paper towels to wipe up the urine and droppings, and throw the paper towels into a plastic bag and seal it carefully. I also removed the shelf paper and then cleaned the pantry again with disinfectant.

In the utility room I sprayed the floor and all flat surfaces, including the furnace ducts, the modem shelf, and the tops of the furnace, water heater and pressure tank, with disinfectant, let it soak in for five minutes, then used paper towels to wipe up all the droppings and a disposable Swiffer wet mop to mop the room. The basement was never cleaner. After cleanup is complete and all paper towels and Swiffer pads are sealed in plastic bags, wash your gloved hands and boots with spray disinfectant or a bleach solution before taking the gloves and boots off. Then throw out the gloves along with the clothes. Wash your hands with soap and warm water after taking off the gloves, and take a nice hot shower before going to the store to buy new shelf paper and pantry staples. While at the store, buy heavy plastic canisters to store grains and baking supplies.

Monday, August 27, 2012

The History of Drinking Water in Washington DC


In the first decade of the nineteenth century a group of residents of Washington DC were granted permission to pipe water from the city spring to their neighborhood in the 600 block of Pennsylvania Avenue. Shortly thereafter the city built a pipe to convey water from a city spring to the northwestern Pennsylvania Avenue vicinity, between 9th and 14th streets. These were the first instances of water deliveries in Washington DC and the beginning of the water system in our nation’s capital. City-wide delivery of fresh water was still another fifty years away and would arrive with the Washington Aqueduct.

The original portions of the Washington Aqueduct were planned and built by Lieutenant Montgomery C. Meigs of the Army Corps of Engineers. The Dalecarlia Reservoir was completed in 1858 and water first reached the District through the Washington Aqueduct system on January 3, 1859. Initially the reservoir provided water to the city from the adjacent Little Falls Branch, but this soon proved inadequate and flow from the Potomac River was added in 1864. At that time the city government believed the Washington Aqueduct system would be sufficient for all the future water needs of the city. Today the Washington Aqueduct is a division of the Baltimore District, U.S. Army Corps of Engineers. The Aqueduct is a federally owned and operated public water supply agency that produces an average of 180 million gallons of water per day at two treatment plants located in Washington DC and sells the water to the District of Columbia, Arlington County, Virginia, and the City of Falls Church, Virginia.

After the initial construction of the Washington Aqueduct, the surge in the population of Washington DC during the Civil War quickly created a human waste problem in the city, and epidemics of smallpox, malaria, and typhoid from human waste contaminating the water supply took many thousands of lives during the war years. Dr. John Snow had discovered and proved the connection between cholera and contaminated water during the 1850s in London, England. Nonetheless, the general belief was that if water looked, tasted and smelled fine it was good, and though the Potomac River provided dilution, disease survived. The Washington Aqueduct was originally built as a water transportation system, to bring the river water into the city. However, in 1895 the flow from Little Falls Branch was diverted away from the Dalecarlia Reservoir to prevent disease, in conjunction with development of the sanitary sewer system. Despite these steps it was clear that the Washington Aqueduct needed to be expanded and to have a filtration system. The Washington Reservoir, now called the McMillan Reservoir, was built in 1902 to increase supply; in 1905 a 75 million gallon per day slow-sand filtration system was added at that reservoir and the Bryant Street high-lift pumping station was built.

After World War I an 80 million gallon per day rapid-sand filter was added at the Dalecarlia Reservoir to address the problems created by continued population growth and the sheer amount of raw sewage that was being pumped into the river. Primary waste treatment began for Washington DC at the Blue Plains sewage treatment plant in 1937. The continued population growth of Washington DC during World War II made it necessary to keep expanding and improving the water supply system. In February 1946, Congress approved comprehensive plans from the Army Corps of Engineers and the City Engineer to construct, improve and add to the existing water system. For more than thirty years, implementation of the plan underwent periodic modifications as requirements changed and as Congress had to approve each increase in funding. This awkward and inefficient oversight and funding arrangement was still in effect when I briefly lived in the District in the early 1970s, when the water and sewage agency was known as the District of Columbia Department of Environmental Services. Later, in 1985, the District Government established a new Department of Public Works, of which the Water and Sewer Administration was a part until 1996.

In 1996, the District Government initiated the creation of the District of Columbia Water and Sewer Authority (DC WASA, re-branded DC Water in 2010), an independent authority of the District of Columbia providing water delivery and sewage services to the region. On April 18, 1996, following a 30-day Congressional review period, the District Council enacted DC Law 11-111, "The Water and Sewer Authority Establishment and Department of Public Works Reorganization Act of 1996." This gave DC WASA a separate and dedicated source of funding: water and sewer rates. It was envisioned that DC WASA would then be able to use that funding to meet its statutory obligation to provide sanitary sewer services and deliver potable water to the Washington Metropolitan Area. The Washington Aqueduct remains federally owned.

Today, the Aqueduct draws water from the Potomac River at the Great Falls and Little Falls intakes and treats the water at two treatment plants, Dalecarlia and McMillan. The Aqueduct filters and disinfects water from the Potomac River to meet current safe drinking water standards. The treatment process includes sedimentation, filtration, fluoridation, pH adjustment, primary disinfection using free chlorine, secondary disinfection with chloramine through the addition of ammonia, and corrosion control with orthophosphate. The EPA sets national limits on residual disinfectant levels in drinking water to reduce the risk of exposure to disinfection byproducts formed when public water systems add chemical disinfectant for either primary or residual treatment. These levels are known as Maximum Residual Disinfectant Levels (MRDLs). The EPA also sets limits on the contaminants regulated under the Safe Drinking Water Act to ensure that the water is safe for human consumption. These limits are known as Maximum Contaminant Levels (MCLs). During calendar year 2011, no MRDL or MCL violations occurred in the Washington Aqueduct system.
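To make the distinction between MCLs and MRDLs concrete, here is a minimal sketch of what a compliance check looks like. The limit values and sample results below are illustrative placeholders, not the Aqueduct's actual monitoring data or the full regulatory tables.

```python
# Illustrative sketch: checking measured results against regulatory limits
# (MCLs for contaminants, MRDLs for residual disinfectants). All numbers
# below are placeholders for illustration, not actual monitoring data.

limits_mg_per_l = {
    "chloramine residual (MRDL)": 4.0,   # assumed residual disinfectant limit
    "nitrate (MCL)": 10.0,               # assumed contaminant limit
    "total trihalomethanes (MCL)": 0.080, # assumed disinfection byproduct limit
}

samples_mg_per_l = {
    "chloramine residual (MRDL)": 3.1,
    "nitrate (MCL)": 1.2,
    "total trihalomethanes (MCL)": 0.042,
}

for analyte, limit in limits_mg_per_l.items():
    measured = samples_mg_per_l[analyte]
    status = "OK" if measured <= limit else "EXCEEDS LIMIT"
    print(f"{analyte}: {measured} mg/L (limit {limit} mg/L) -> {status}")
```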

By 1996 some portions of the water delivery system were 100 years old and the sewage system was almost the same age. The water and sewage rates in place in the Washington Metropolitan Area covered the costs to deliver the water, treat the sewage and replace 0.33% of the system each year, an unrealistic and irresponsible repair and replacement rate. DC Water averages between 400 and 500 water main breaks per year, yet a plan to replace the system over a 300 year time span was considered to be meeting their statutory obligations.
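As a back-of-the-envelope check, a constant annual replacement rate implies a full-system turnover time of roughly one divided by that rate:

```python
# Back-of-the-envelope: years to replace an entire pipe network at a constant
# annual replacement rate (ignoring system growth and accelerating failures).

def full_replacement_years(annual_rate_fraction: float) -> float:
    return 1.0 / annual_rate_fraction

print(full_replacement_years(0.0033))  # ~303 years at 0.33% per year
print(full_replacement_years(0.01))    # 100 years at 1% per year
```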

Water delivery systems have a long life span, since they are just pipes, pumps and valves, but that life span is not infinite. We reward short-sighted behavior. In order to have cheaper water and sewer service, a replacement cost schedule was not built into customer rates for the past 78 years, which coincidentally is the current average age of a water main in Washington DC. There are water pipes north of the White House that are reported to have been laid before the Civil War. This past spring DC Water announced that it has tripled the replacement rate to 1% (with, of course, an increase in water rates) so that in 100 years the system will be replaced. Sewage rates were increased to finance the District’s portion of the $7.8 billion Blue Plains improvement program, the Clean Rivers Project, which will meet the reduced total nitrogen release requirements of the operating permits and increase control of the system during rain storms; in addition, sludge treatment will be improved and sewer piping upgraded in many areas. In truth, according to an interview with General Manager George Hawkins on National Public Radio, DC Water has gotten so far behind that it cannot catch up; it will take decades. Given the age of the water system in Washington DC, the increase in the replacement rate was probably necessary just to address what is failing each year. One hundred years is longer than the predicted life of a water distribution system: piping systems are rated at 80 years and the average water main in Washington DC is 78 years old. The water pipes in DC are old. They leak. DC Water is trying to use predictive modeling to determine which pipes to replace first, to keep the good quality* water they are buying from the Washington Aqueduct flowing to the homes and businesses in the District.
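DC Water has not published the details of its predictive model, but as a rough illustration of the idea, here is a sketch that ranks hypothetical mains by a simple score built from age and recent break history. Real models use far more variables (material, soil, pressure, diameter, leak surveys); the mains and numbers below are invented.

```python
# Rough illustration of age- and break-history-based replacement prioritization.
# All mains, ages, and break counts are hypothetical.

mains = [
    {"id": "main-A", "age_years": 130, "breaks_last_10y": 4},
    {"id": "main-B", "age_years": 78,  "breaks_last_10y": 1},
    {"id": "main-C", "age_years": 45,  "breaks_last_10y": 0},
    {"id": "main-D", "age_years": 95,  "breaks_last_10y": 3},
]

def priority_score(main, design_life_years=80):
    age_factor = main["age_years"] / design_life_years  # >1 means past rated life
    break_factor = main["breaks_last_10y"]              # recent failures weigh heavily
    return age_factor + break_factor

for main in sorted(mains, key=priority_score, reverse=True):
    print(f'{main["id"]}: score {priority_score(main):.2f}')
```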

Washington DC (and most of America) has always thought about the cost of water wrong; there should always have been a plan for maintenance, upgrade and replacement of the system and for the care and protection of the water resources. Instead we have all taken water (and sewage) for granted. Every pipe should have been on a schedule to be replaced before it exceeded its life and broke. Water rates need to cover these capital replacement and maintenance costs. If we do not maintain our infrastructure we will not have on-demand water. DC Water sees persuading customers to pay for the maintenance and improvement of the water and sewer system as its biggest challenge. Investment in water and sewer infrastructure is simply one of the best investments any community can make.

* Dr. Marc Edwards, a professor of engineering at Virginia Tech, discovered while doing research in the mid-1990s to identify the cause of an increasing incidence of pinhole leaks in copper water pipes that chloramine was causing the accelerated pipe deterioration and extreme lead concentrations in DC drinking water. Chloramine-treated water picks up lead from pipes and solder and does not release it, resulting in elevated lead levels and deterioration of the pipes. The change to chloramine was made after the EPA issued regulations concerning disinfection by-products formed when chlorine reacts with organic matter in drinking water; the EPA considered these byproducts to be a potential health threat, and chloramines do not produce them. The lead problem was addressed in 2004 by the Washington Aqueduct adding additional treatment steps to prevent the chloramine from dissolving lead in the water mains, solder joints, and fixtures. In addition, DC WASA spent $97 million on partial replacements of some 15,000 pipes and about 2,000 full pipe replacements. Then, after the dust settled, it re-branded itself as DC Water.

Thursday, August 23, 2012

EPA Rule Targeting Coal Power Plants Voided by Court

On Tuesday the U.S. Court of Appeals for the District of Columbia Circuit ruled (2-1) that the Cross-State Air Pollution Rule, CSAPR, exceeded the U.S. Environmental Protection Agency’s authority by requiring some states to clean up more than their fair share of pollution. CSAPR defined each state’s emissions reduction goals and the Federal Implementation Plans to achieve those goals at the state level. However, the EPA had used computer modeling to generate emissions “budgets” for each upwind state based not on the amount of pollution each state was contributing to a downwind problem, but on the cost of remediation. The EPA required the level of cleanup to be based on cost, with more work to be done wherever the cost of capturing a ton of sulfur dioxide or nitrogen oxides was lowest, creating a pollution trading system.
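To illustrate the distinction the court focused on, here is a small sketch contrasting two ways of dividing a required reduction among upwind states: in proportion to each state's contribution to the downwind problem, versus wherever the tons are cheapest to remove. The states, tonnages, and costs are made-up numbers for illustration, not values from the rule.

```python
# Illustrative contrast: allocating a required emissions reduction by
# contribution to the downwind problem vs. by lowest cost per ton removed.
# All figures below are hypothetical.

states = {
    #           contribution (tons)     cost ($ per ton removed)
    "State A": {"contribution": 60_000, "cost_per_ton": 900},
    "State B": {"contribution": 30_000, "cost_per_ton": 300},
    "State C": {"contribution": 10_000, "cost_per_ton": 150},
}
required_reduction = 50_000  # tons

# Rule 1: proportional to each state's contribution
total = sum(s["contribution"] for s in states.values())
by_contribution = {name: required_reduction * s["contribution"] / total
                   for name, s in states.items()}

# Rule 2: cheapest tons first, capped at each state's contribution
by_cost, remaining = {name: 0 for name in states}, required_reduction
for name, s in sorted(states.items(), key=lambda kv: kv[1]["cost_per_ton"]):
    take = min(remaining, s["contribution"])
    by_cost[name], remaining = take, remaining - take

print("By contribution:", {k: round(v) for k, v in by_contribution.items()})
print("By lowest cost: ", by_cost)
```

Under the cost-based rule, the states with cheap reductions end up doing far more than their share of the downwind problem, which is essentially the objection the court sustained.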

CSAPR, requiring reductions of sulfur dioxide and nitrogen oxide emissions at coal-fired plants, was intended to go into effect on January 1, 2012, but the U.S. Court of Appeals for the District of Columbia Circuit granted a stay of implementation pending resolution of the legal challenges. Now the Court of Appeals has found the rule exceeded EPA authority. CSAPR, had it been implemented, would have reduced SO2 emissions by 73% from 2005 levels and NOx emissions by 54% at the approximately 1,000 coal-fired electrical generation plants in the eastern half of the country. The rule was intended to help downwind states attain the 24-Hour and/or Annual PM2.5 National Ambient Air Quality Standards (NAAQS) and the 1997 8-Hour Ozone NAAQS. CSAPR would have replaced EPA's 2005 Clean Air Interstate Rule (CAIR), which will now remain in effect. Both rules are intended to help states better control their particulate pollution.

According to the Lung Association, the two biggest air pollution threats in the United States are ozone and particle pollution. Other pollutants include carbon monoxide, lead, nitrogen dioxide, sulfur dioxide and a variety of toxic substances, including mercury, that appear in smaller quantities. The EPA requires states to monitor air pollution under the NAAQS to assess the healthfulness of air quality and ensure that they meet minimum air quality standards. One NAAQS standard covers particulate matter 2.5 micrometers or smaller in diameter, called PM2.5. Combustion engines and coal-burning power plants are key contributors to PM2.5 particles, and according to the US EPA and World Health Organization, the smaller, finer pollutants measured by PM2.5 are especially dangerous for human health. Studies have shown that people are at increased risk of asthma, lung cancer, cardiovascular problems, birth defects and premature death from particles smaller than 2.5 microns in diameter that lodge deep in the lungs.

CSAPR was intended to keep pollution from one state from moving into other states and preventing them from meeting their air quality goals. Several states have been unable to meet the current particulate standard. PM2.5 particles can be either directly emitted or formed via atmospheric reactions. Primary particles are emitted from cars, trucks, and heavy equipment, as well as residential wood combustion, forest fires, and agricultural waste burning. The main components of secondary particulate matter are formed when pollutants like NOx and SO2 react in the atmosphere to form particles.

Currently, under the Clean Air Act the US EPA has established both annual and 24-hour PM2.5 air quality standards (as well as standards for other pollutants). The annual standard is 15 ug/m3 (an air quality index, AQI, of 49). The 24-hour standard is 35 ug/m3 (an AQI of 99). In June of 2012 the EPA announced that it is proposing stricter particulate air quality standards to go into effect in December 2012; the annual standard is anticipated to be 12-13 ug/m3, while the maximum 24-hour standard will remain unchanged at 35 ug/m3. According to the American Lung Association's State of the Air report, Pittsburgh, PA had the highest particle pollution in the nation on an annual basis. Seven cities averaged particulate levels higher than the current 15 ug/m3 standard allows: Bakersfield, CA; Hanford, CA; Los Angeles, CA; Visalia, CA; Fresno, CA; Pittsburgh, PA; and Phoenix, AZ. The American Lung Association states in its latest report that twenty cities have average year-round particle pollution below the current regulated level but above the proposed EPA standard of 12-13 ug/m3. While particulate pollution remains a problem, the EPA has not been able to address it through regulation targeted at coal-fired power plants.
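For readers curious how a concentration becomes an AQI value, the EPA converts a measured concentration to an index by linear interpolation within a band of breakpoint concentrations. The breakpoint table changes whenever the standard is revised, so the bands in this sketch (my reading of the pre-2012 PM2.5 table) should be treated as illustrative rather than authoritative.

```python
# Sketch of the EPA piecewise-linear AQI calculation for 24-hour PM2.5:
#   AQI = (I_hi - I_lo) / (C_hi - C_lo) * (C - C_lo) + I_lo
# within the breakpoint band containing the concentration C.
# Breakpoints below are illustrative; they are revised along with the standard.

PM25_BREAKPOINTS = [
    # (C_lo, C_hi, I_lo, I_hi)
    (0.0,   15.4,   0,  50),
    (15.5,  40.4,  51, 100),
    (40.5,  65.4, 101, 150),
    (65.5, 150.4, 151, 200),
]

def pm25_to_aqi(concentration_ug_m3: float) -> int:
    for c_lo, c_hi, i_lo, i_hi in PM25_BREAKPOINTS:
        if c_lo <= concentration_ug_m3 <= c_hi:
            return round((i_hi - i_lo) / (c_hi - c_lo)
                         * (concentration_ug_m3 - c_lo) + i_lo)
    raise ValueError("concentration outside tabulated breakpoints")

print(pm25_to_aqi(15.0))  # near the top of the 'Good' band
print(pm25_to_aqi(35.0))  # well into the 'Moderate' band
```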

The earth’s atmosphere is interconnected. That is accepted when it comes to carbon dioxide, but it also applies to industrial pollutants and soot. The EPA has estimated that just one-quarter of measured U.S. pollution emissions from coal-burning power plants are deposited within the contiguous U.S.; the remainder enters the global cycle. Conversely, current estimates are that less than half of all measured coal pollution deposited within the United States comes from American sources. According to the Mount Bachelor Observatory, Chinese exports include acid rain that falls in China, Korea, and Japan, and pollutants that enter the air stream, including sulfates, NOx, and black carbon (soot) produced by cars, stoves, factories, and crop burning.

Monday, August 20, 2012

Recharging Groundwater

From the Johns Hopkins University website
“The increasing use of ground water for industrial, municipal, and irrigation supply in the United States has emphasized the need for recharging the ground water in many areas by artificial means. Although the practice of artificial recharge is not widespread in the eastern half of the United States, it has been important in southern California for water conservation and flood control since about 1895.” This statement was written by David K. Todd in 1956 as the opening lines of a report titled: ANNOTATED BIBLIOGRAPHY ON ARTIFICIAL RECHARGE OF GROUND WATER THROUGH 1954.

Clearly, artificial recharge of groundwater is not a new idea, and during the second half of the 20th century it became more common to use structures such as basins and pits to increase recharge or infiltration of groundwater, or to inject surface water directly into aquifers. Overdrawing the groundwater aquifers in the coastal regions of our country is most easily seen in the intrusion of salt water. Overdrawing a groundwater aquifer can have many negative impacts on the regional hydraulic balance: a reduction in discharge to surface water at some other location, an increase in recharge from surface water, a loss of storage in the aquifer as the water table falls, a loss of an aquifer to salt water intrusion, or some combination of these effects. To remain a renewable resource, the amount of groundwater removed from an aquifer needs to match the recharge rate. What we consume must be replaced.

As our cities have spread out, development characterized by pavement, buildings, and other impervious surfaces prevents the infiltration of precipitation that occurred before development, while the growing population increases the demand for water. Changing the recharge rate by reducing open areas or speeding runoff over pavement can change the entire water balance and ecology of a region. In some areas of the country (and world), groundwater currently being used entered the aquifer millennia ago, when the climate in that area was wetter. That water is not being replaced under current climate conditions and the supply may ultimately be exhausted unless artificially recharged. The amount of groundwater removed from an aquifer needs to be sustainable, matching the recharge rate, whether that recharge rate is natural or artificially enhanced.
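In its simplest form this is just a water-balance statement: storage change is recharge minus withdrawals. A minimal sketch, with hypothetical volumes:

```python
# Minimal annual water-balance sketch for an aquifer. All volumes are
# hypothetical, in thousand acre-feet per year (taf/yr).

natural_recharge = 40.0
artificial_recharge = 15.0
withdrawals = 70.0

storage_change = natural_recharge + artificial_recharge - withdrawals
print(f"Annual storage change: {storage_change:+.1f} taf  (negative = overdraft)")

# In this simplified view, the sustainable withdrawal is whatever keeps
# long-term storage change at or above zero.
sustainable_withdrawal = natural_recharge + artificial_recharge
print(f"Sustainable withdrawal under these assumptions: {sustainable_withdrawal:.1f} taf/yr")
```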

These days recharge of groundwater through spreading basins, pits, and injection or drainage wells is more widely practiced and will likely increase in areas of limited rainfall. There are many challenges to recharging groundwater. The first is geologic. Except for recharge using injection wells directly into an aquifer, artificially recharged water must first move through the unsaturated zone. Characterization of the soils and geology is essential to determining the viability of an artificial recharge project. Areas where ground subsidence has been caused by excess withdrawals of groundwater compressing fine-grained confining beds of sediments cannot be effectively recharged. In addition, geological characteristics such as faults with significant offset, folds, and extensive coarse- or fine-grained sedimentary units can control both groundwater flow and the fate of water from artificial recharge. Groundwater is not an underground bathtub of water; the site-specific geology will determine the ability to recharge the aquifer and where that recharge must take place.

Artificial recharge of a groundwater basin can be used to store surface water when supplies are plentiful, turning the aquifer into a reservoir for dry periods when water is less available. Storing water in an underground aquifer is generally considered less environmentally damaging than dam and reservoir construction, and underground storage significantly reduces water loss from evaporation. Recharging an aquifer has lower capital costs than dam and reservoir construction, but it requires similar distribution networks and pumping costs tend to be higher. Monitoring water availability and allocations, and the water rights of overlying landowners, complicates water ownership and allocation. In California “The Water Plan” imports several million acre-feet of water from northern to southern California each year. A significant portion of the imported water is stored in groundwater aquifers using artificial recharge, but artificial recharge is also used to stretch the water supply, which could allow introduction of contaminants into the aquifer.

For 30 years Los Angeles County has recycled the water from wastewater treatment plants. This water, from both secondary and tertiary treated wastewater, is discharged into spreading basins to recharge groundwater. Groundwater recharge can be done by surface spreading or direct injection wells; California guidelines recommend spreading over injection because of concerns about water quality and potential health hazards. It has long been known that soil filtration improves water quality, and soil column studies with secondary effluent from wastewater treatment have shown dissolved organic carbon (DOC) removal of 56% for sandy loam, 48% for sand and 44% for silty sand, with most sites removing about 48% of DOC by percolation through 20 feet of soil. Greater depths of soil (80 feet) reduced DOC by 92%. This is how septic systems work and how nature filters groundwater. However, prior to discharge, wastewater is heavily chlorinated and subsequently dechlorinated, and because of the high DOC concentrations in wastewater, high concentrations of disinfection by-products (DBPs) are created.
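Those removal figures are roughly consistent with simple first-order attenuation of DOC with depth. Here is a quick sketch fitting a rate constant to the 20-foot figure and checking it against the 80-foot figure; this is a crude illustration, not how the cited studies actually modeled removal.

```python
import math

# Assume first-order attenuation with depth: C/C0 = exp(-k * z).
# Fit k to ~48% removal at 20 feet, then predict removal at 80 feet
# and compare with the reported ~92%.

removal_20ft = 0.48
k = -math.log(1 - removal_20ft) / 20.0          # per foot of soil column

predicted_removal_80ft = 1 - math.exp(-k * 80)  # compare with the reported ~92%
print(f"k = {k:.4f} per foot")
print(f"Predicted removal at 80 ft: {predicted_removal_80ft:.0%}")
```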

In 1996 the concentrations of disinfection by-products such as total organic halide (TOX) and trihalomethanes (THMs) were studied in the groundwater basin, and it was discovered that these disinfection byproducts in reclaimed water were not removed by percolation through the soil; total organic halide removal was only 17%. (Fate of Disinfection By-products in the Subsurface by Colleen Rostad, U.S. Geological Survey.) The quantity and type of DBPs vary not only with water quality and disinfection conditions, but also with the properties of the organic molecules that make up the dissolved organic carbon. Studies in Los Angeles County have found that these compounds in reclaimed water are not rapidly removed by soil percolation. As our need to recycle water expands, we are potentially introducing TOX, THMs and many other contaminants into our groundwater aquifers. As coastal cities need to recycle more water and use it to recharge groundwater aquifers to maintain the supply of available water, we need to better understand what contaminants (and emerging contaminants) are carried in the wastewater and survive soil percolation. The groundwater aquifer serves to dilute the wastewater contaminants that survive soil percolation, but we need to be honest and informed about what we are putting into, or leaving in, what is ultimately our drinking water supply. An interesting note: research at Johns Hopkins University seems to indicate that groundwater recharge using soil filtration of wastewater treatment plant effluent may be an effective method of removing trace pharmaceuticals and personal care products from the water. So recharging groundwater may be preferable to releasing effluent to rivers for downstream reuse.

Thursday, August 16, 2012

Hetch Hetchy Valley Restoration Plan on Ballot for November

Picture from SF PUC web site
The Hetch Hetchy Valley is a glacier-carved granite canyon located in Yosemite National Park. Though once described by John Muir as a smaller version of the Yosemite Valley, it is currently dammed and used as a reservoir and hydroelectric generator by the City of San Francisco. After obtaining more than 15,000 signatures, a group whose ultimate goal is to restore the Hetch Hetchy Valley has put an initiative on November’s ballot. The initiative would require the city to create a new master plan for its water system based on draining the reservoir and returning Hetch Hetchy to the National Park Service.

The initiative requires that the water plan include water recycling, water reclamation, conservation, improved storm water capture, increased development of groundwater sources (including recharge capability), and replacement of the hydropower lost with the dam with solar and wind renewable sources of power. In addition, the plan would have to develop other sources of water supply, because all the studies cited on both sides of the argument indicate that the water supply would otherwise fall short.

The Hetch Hetchy system consists of much more than the reservoir created by the O’Shaughnessy Dam. The system consists of Hetch Hetchy Reservoir, Cherry Reservoir, Eleanor Reservoir, a portion of the New Don Pedro Reservoir, other small surface reservoirs of various sizes and significance, and five groundwater basins. The stored water in the system is transported to the cities, towns and farms supplied by the system via the Hetch Hetchy Aqueduct, the California Aqueduct, the Delta Mendota Canal, the South Bay Aqueduct, and the Pacheco Tunnel.
Taken from BAWSCA
Now developing a plan for removing O’Shaughnessy Dam to restore the Hetch Hetchy Valley is on the ballot. While many will dismiss the idea as preposterous, enough studies have been done and computer models created that it is fairly certain the idea is merely extremely costly and would somewhat reduce the reliability of the water supply. Billions upon billions of dollars, depending on how certain you want your water supply to be, and remember these are costs that would have to be borne by San Francisco water rate payers, not state or national taxpayers.

First of all, there are the inescapable regulatory costs; these costs are certain. While O’Shaughnessy Dam has a storage capacity of 360 thousand acre-feet (taf) and the Hetch Hetchy system stores 2,000 taf of water a year, its value is more than that modest amount of water storage. The O’Shaughnessy Dam and Hetch Hetchy Reservoir provide water storage, hydropower generation, and some flood control, but their primary value is that water from O’Shaughnessy Dam has filtration avoidance status. Without O’Shaughnessy Dam, the San Francisco Public Utilities Commission, SFPUC, would have to build a water treatment plant for filtration. New York is currently completing the construction of a new filtration facility to treat the water from the Croton Water System; though it was originally estimated to cost $950 million, to date its cost has been $2.8 billion. This would translate to a cost of $5-$6 billion to build a filtration plant for SFPUC. Then, on top of the construction costs and interest payments, there are the operating costs. Estimates have ranged from $13 million (estimated in 2000) to $20 million annually.

In addition, to replace the storage, various solutions have been suggested, the most obvious being raising the Don Pedro dam to increase the reservoir's storage. This would require a major construction project. I could find neither feasibility studies nor estimates of the costs associated with that project; the dam cost about $100 million in 1967. Would it cost five times, or ten times, that to expand it now? There is very little recent dam expansion or construction data to use to estimate the cost. However, to effectively replace the Hetch Hetchy storage, an inter-tie linking the New Don Pedro Reservoir to the Hetch Hetchy Aqueduct would be necessary to avoid releasing the water through the Delta (and we know how that works out). It would probably cost a couple of billion dollars or more to take care of those two items.

The ballot measure calls for including water recycling, water reclamation, and improved storm water capture in the plan. The reason is that there is virtually no other way to obtain enough water for the Bay Area without them. It should be noted that at some point in the future these water reuse strategies will be necessary anyway if the population of the Bay Area continues to grow. There are 993 miles of combined sewers in San Francisco, which collect sanitary sewage from toilets and drains in apartments, homes, schools, offices and other businesses, along with street runoff throughout the city. The Water Sustainability and Environmental Restoration Plan would require rebuilding the sewer system to separate the storm water system from the sanitary sewer system and improving the water treatment and recovery of the wastewater treatment plants in San Francisco.

The two major wastewater treatment plants that serve the city of San Francisco are the Southeast Water Pollution Control Plant in Bayview, built in the 1950s, and the Oceanside Water Pollution Control Plant, built in 1993. On a dry day the San Francisco combined sewage system treats about 80,000,000 gallons of sanitary sewage. The San Francisco wastewater treatment plants are not designed to do much more than screen out trash, skim off scum and grease, and use bacterial action to digest toilet paper and bio-solids. Pharmaceuticals, pesticides, hydrocarbons and anything else the residents can think to pour down the drain or dump into the storm drains will either clog the system or be released into the Bay or ocean. If we are to reuse this water, additional treatment and control would be necessary, and the cost of those systems would depend on the level of treatment desired, up to and including treating that water to drink. Washington DC is currently engaged in a plan to upgrade its Blue Plains wastewater treatment system, which discharges to the Potomac; that program is estimated to cost $7.8 billion.

CALVIN (CALifornia Value Integrated Network) is an economic-engineering optimization model of California’s inter-tied water management system. It was developed by Jenkins et al. at the University of California, Davis, and has been used to make many of the estimates of the feasibility of removing the O’Shaughnessy Dam and restoring the Hetch Hetchy Valley. CALVIN uses 72 years of monthly historical data, from water year 1922 to water year 1993, to represent future hydrology. This period includes the droughts of 1929-1934, 1976-1977, and 1987-1992, some of the worst on record. However, this period may not be representative of the severity and duration of droughts, or of what rainfall will look like, in the decades ahead; Australia’s experience with drought duration last decade should have taught us that. The driest year in the data was 1924, when the state average rainfall was only 10.50 inches, but in that year the San Francisco Bay region was drier than the rest of the state. The future might hold a worse or longer drought, or a regional drought.

An important limitation of CALVIN is perfect foresight: CALVIN knows when the droughts are going to happen and how long they will be. This allows the model to be proactive in preparing for droughts, reducing water scarcity and its associated costs. In addition, the model assumes that unlimited amounts of water can be purchased from others (farmers) to make up any shortfall and that there exists a pipe linking the New Don Pedro Reservoir to the Hetch Hetchy Aqueduct.
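A toy example shows why perfect foresight flatters the results: an operator who knows a drought is coming can ration early and spread the shortage, which looks much cheaper when scarcity costs rise steeply with the size of a shortfall. The reservoir, inflows, and hedging rule below are invented for illustration and bear no relation to CALVIN's actual formulation.

```python
# Toy reservoir facing a drought it either does or does not see coming.
# With perfect foresight the operator rations early, spreading the shortage;
# under a convex scarcity cost (here, shortfall squared) that is much cheaper.
# Volumes are arbitrary units; this is a sketch, not the CALVIN model.

inflows = [100, 100, 20, 20, 100]   # a drought in years 3 and 4
demand, capacity = 90, 150

def simulate(hedge_before_known_drought):
    storage, shortfalls = 100, []
    for year, inflow in enumerate(inflows):
        available = min(capacity, storage + inflow)
        # Perfect foresight: deliberately under-deliver the year before the drought.
        target = demand * (0.7 if hedge_before_known_drought and year == 1 else 1.0)
        delivered = min(target, available)
        shortfalls.append(demand - delivered)
        storage = available - delivered
    return shortfalls

for label, hedged in [("Myopic", False), ("Perfect foresight", True)]:
    s = simulate(hedged)
    print(f"{label}: shortfalls {s}, scarcity cost {sum(x * x for x in s):,}")
```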

This ballot measure is for the development of the Water Sustainability and Environmental Restoration Plan. The plan has the potential to be forward looking and to develop an integrated and robust design for the water and wastewater systems of the Bay Area for the next century. The cost of developing the plan is limited to 0.5% of funds previously authorized by voters for the Water System Improvement Program, WSIP. That translates to $23 million of the $4.6 billion WSIP just to develop a plan that must be completed by November 2015 “in time for the San Francisco Board of Supervisors or a group of citizens to propose a charter amendment to be voted on at the November 2016 election, which if passed would authorize implementation of the plan.”

Even this ballot measure is a significant decision for the residents of San Francisco, and it is about much more than restoring the Hetch Hetchy Valley. It is a first step on the road towards rebuilding the entire water storage, treatment and sewer system for the Bay Area, as well as replacing the hydroelectric power from O’Shaughnessy Dam with other (hopefully renewable) power. The full implementation of the plan could easily cost San Francisco rate payers $10-$20 billion for infrastructure. I assume that funds to restore the Hetch Hetchy Valley would come from donations or the National Park Service. The $23 million that the ballot measure authorizes for the study may not be enough to develop a fully integrated and workable plan for transport, sewers, water treatment, water recycling and power generation, let alone hold the necessary public hearings and outreach in San Francisco.

Monday, August 13, 2012

Chevron Fire and Air Quality in the East Bay

On the evening of August 6, 2012 a fire broke out at the Chevron Refinery in Richmond, CA. The fire began with a leak in a crude oil unit and was contained that evening, though a small controlled burn was allowed to continue to reduce pressure in the system. Area residents were told to stay inside with windows closed for more than five hours before the all clear was given by the Contra Costa County Health Department.

The Bay Area Air Quality Management District (Air District) responded to the emergency and took eight air samples during the fire. After analysis of the samples for 23 chemical compounds the Air District reported that they found only acrolein levels above the Reference Exposure Level, REL. The Air District noted that throughout the Bay Area acrolein levels are commonly above the REL (not exactly comforting). Reportedly, acrolein levels routinely range between 1 and 4.5 parts per billion. The Air District stated in their news release that “(t)hese concentrations were similar to the “background” levels measured throughout the Bay Area by our monitoring network.”

While this sounds comforting, the Air District's report can be misleading. First of all, the data is incomplete, and second, there are three types of REL: acute (1-hour exposure), 8-hour (which may be repeated) and chronic. When the Air District spoke of the REL it was referring to the acute level, not normal everyday background levels. According to Cal EPA, “The acute REL is an exposure that is not likely to cause adverse effects in a human population, including sensitive subgroups, exposed to that concentration for one hour on an intermittent basis.” These health-based acute RELs apply to spills, leaks, or other discharges to the ambient air that result from routine operation of a facility and from predictable process upsets or leaks.

Cal EPA determined the concentration level at which no adverse health effects are anticipated for a short-term exposure. The acute RELs are based on reports of adverse health effects in the most sensitive populations in the medical and toxicological literature. RELs are designed to protect the most sensitive individuals in the population through the inclusion of margins of safety. Because margins of safety are incorporated into the REL to address data gaps and uncertainties, exceeding the REL does not automatically indicate an adverse health impact. The acute REL was the “background” level that the Air District referred to. More importantly, the data from the Air District is not yet complete: there was no measurement of particulates.

According to reports in the San Francisco Chronicle there was a huge cloud of smoke and hundreds of people went to the hospital to seek treatment. The most likely source of health impacts from the fire is particulate matter from smoke. The Air District does not yet have any data to release on particulate concentrations downwind from the Chevron refinery. For some reason the stationary continuous particulate monitors for the Air District in the local area are upwind of the refinery. The Air District says samples were taken in San Pablo, two miles from the Chevron facility following the fire and are being analyzed for particulate matter levels. Results are expected in the next two weeks, but there is only a small chance that the regularly scheduled monitoring time will have caught the smoke cloud.

Particulate matter is made up of particles that are emitted directly, such as soot and dust, as well as secondary particles that are formed in the atmosphere from reactions of precursor pollutants such as oxides of nitrogen (NOx), sulfur oxides (SOx), volatile organic compounds (VOCs), and ammonia (NH3). Directly emitted particles come from a variety of sources such as cars, trucks, buses, industrial facilities, power plants, construction sites, tilled fields, unpaved roads, stone crushing, and the burning of wood. Other particles are formed indirectly when gases produced by fossil fuel combustion react with sunlight and water vapor. Many combustion sources, such as motor vehicles, power plants, and refineries, both emit particles directly and emit precursor pollutants that form secondary particulates. Ammonium nitrate and ammonium sulfate are the principal components of secondary particulates.

Particulate matter has immediate health impacts: itchy, watery eyes; increased respiratory symptoms such as irritation of the airways, coughing or difficulty breathing; and aggravated asthma. Health effects can result from both short-term and long-term exposure to particulate pollution. Exposure to particles can also trigger heart attacks and cause premature death in people with pre-existing cardiac or respiratory disease. The people most sensitive to particulate pollution include infants and children, the elderly, and persons with existing heart and lung disease. The particles can travel deep into the lungs, enter the bloodstream, and penetrate into cells. Smaller particles penetrate deepest, causing the greatest harm. Researchers are still trying to identify which types and sources of particles are most hazardous to human health, though particles created from combustion soot tend to be fine particles with diameters smaller than 2.5 microns (PM2.5), which are the most dangerous because they lodge deep in the lungs. Dust is mostly coarser particles.

According to the Lung Association, the two biggest air pollution threats in the United States are ozone and particle pollution. The United States Environmental Protection Agency, U.S. EPA, requires states to monitor air pollution to assess the healthfulness of air quality and ensure that they meet minimum air quality standards. The US EPA has established both annual and 24-hour PM2.5 air quality standards (as well as standards for other pollutants). The annual standard is currently 15 ug/m3 (an AQI of 49). The 24-hour standard was recently revised to a level of 35 ug/m3 (an AQI of 99). The EPA is proposing new particulate air quality standards to go into effect in December 2012: the annual average standard would be lowered to 12-13 ug/m3, while the maximum 24-hour standard will remain unchanged at 35 ug/m3. The Bay Area Air District does not have a live feed of its particulate pollution levels like some regions of the state.

The Bay Area's cool, coastal climate safeguards the region against some factors that exacerbate pollution, such as high temperatures and stagnant air flow. Nonetheless, the Bay Area is a nonattainment area due to persistent smog. Santa Clara, Solano and Contra Costa counties each had more than 12 high particle pollution days in the last year of record. A study of children in Southern California showed lung damage associated with long-term particulate exposure, and a multi-city study found decreased lung function in children associated with long-term particulate exposure. Many scientific studies have linked exposure to excess particulates to aggravated asthma, increased respiratory symptoms like coughing and difficult or painful breathing, chronic bronchitis, decreased lung function, heart attack and premature death. According to the American Lung Association's State of the Air report, Pittsburgh, PA had the highest particle pollution in the nation on an annual basis and Bakersfield, CA had the highest number of days exceeding the daily level. Seven cities averaged particulate levels higher than the current 15 ug/m3 standard allows: Bakersfield, CA; Hanford, CA; Los Angeles, CA; Visalia, CA; Fresno, CA; Pittsburgh, PA; and Phoenix, AZ.

Thursday, August 9, 2012

Sinkholes In America

According to the Infrastructure Report Card from the American Society of Civil Engineers, New York's wastewater infrastructure needs an investment of $21.82 billion over the next 20 years to avoid failure. Just to emphasize the point, in Bay Ridge, Brooklyn, not far from where I once lived in the 1970s, the street suddenly caved in on August 1st and a 30-foot-wide, 10-foot-deep sinkhole appeared. This was the second sinkhole in this neighborhood of Brooklyn this year; the earlier sinkhole was twice as large. The city of New York immediately began repairs. According to the City, the sinkhole was caused by a 112-year-old sewer pipe that gave way and washed away the soil under the road. The sewer pipe had no doubt been failing for a long time to erode that much soil.

Picture by CBS 2 Bay Ridge, Brooklyn sinkhole

The pipes made early in the 20th century were rated for an 80-year life span. New York's sewer piping systems have reached the end of their design lives, but there is no plan for systematic replacement and upgrade of the system. New York has kept the sewer system alive by repairing what breaks after it breaks and spraying its mains with concrete. When the sewer systems were built in the 19th and 20th centuries, the Boroughs of New York still had autonomy and financed and built their own systems to address the problem in the least expensive way, using a combined storm and sanitary sewer system. The Manhattan and Brooklyn systems were built around the same time, modeled on the designs of Hamburg, Germany. Nonetheless, the sewers were never designed for the ages; they need to be maintained and upgraded. New York, and most of the United States, has not maintained and upgraded our sewer systems. According to the U.S. Environmental Protection Agency, EPA, by 2020 half the sewer pipes in the U.S. will be crumbling and the U.S. risks reversing the public health and environmental gains of the past three decades. (Rose George in The Big Necessity, The Unmentionable World of Human Waste and Why it Matters.)

Failing sewer pipes can pose a significant threat to public health and the environment. Systems with inadequate hydraulic capacity and/or blockages in the sewer pipes may lead to sanitary sewer overflows and sewage backing up into homes or onto streets. Some of the health hazards associated with basement flooding by untreated wastewater include the potential presence of pathogenic microorganisms such as viruses, bacteria, and protozoa (in a nice moist dark environment). Pipe failures can be grouped into three general categories: hydraulic restrictions (e.g. blockages), hydraulic capacity, and structural condition.

Hydraulic restrictions are the most common problem in wastewater collection systems. Untreated wastewater carries sediment, grease, rags and whatever else you can think to put down the drain or flush down the toilet. The grease congeals with sediment and blockages form. In combined sewers, large items thrown or washed into the storm system create obstructions and restrictions during peak flow times. This can lead to street and basement flooding. Tree root intrusion, sediment accumulation, and grease build-up all contribute to hydraulic restrictions. Aging pipes that sag over time and joints that begin to fail can slow pipe flow and create more favorable conditions for solids to build up in pipes. An ongoing maintenance program for cleaning and flushing sewers is typically adequate to control blockages. However, budgeting for routine preventative maintenance and repairs has not taken place in New York or the rest of the United States. We fix it when it breaks.

Failure caused by inadequate flow capacity is common in combined sewer systems, but it may also be a sign of other problems such as structural or design defects. Major structural defects include cracks, broken pipes, and leaks. Pipes sag and sewage does not flow properly; areas of inadequate pipe slope can be due to loss of pipe bedding or inadequate initial slope. Structural failure, as in Bay Ridge, is typically caused by defects of the pipe wall and/or the soil envelope used to support the pipe.
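To make the slope point concrete, full-pipe capacity in a gravity sewer is commonly estimated with Manning's equation. Here is a sketch with assumed example values for diameter, roughness, and slope; the numbers are illustrative, not a design calculation.

```python
import math

# Sketch: full-flow capacity of a circular gravity sewer from Manning's equation,
# Q = (k/n) * A * R^(2/3) * S^(1/2), with k = 1.49 for US customary units.
# Diameter, roughness, and slopes below are assumed example values.

def full_pipe_capacity_cfs(diameter_ft, slope_ft_per_ft, n=0.013):
    area = math.pi * diameter_ft ** 2 / 4    # flow area, square feet
    hydraulic_radius = diameter_ft / 4       # A / wetted perimeter for a full pipe
    return (1.49 / n) * area * hydraulic_radius ** (2.0 / 3.0) * math.sqrt(slope_ft_per_ft)

# A sagging or poorly bedded run with half the design slope loses roughly
# 30 percent of its capacity even though nothing is actually blocked.
print(f"Design slope:  {full_pipe_capacity_cfs(2.0, 0.004):.1f} cfs")
print(f"Flattened run: {full_pipe_capacity_cfs(2.0, 0.002):.1f} cfs")
```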

The EPA reported that the total investment needs of America's publicly owned treatment works as of January 1, 2004, were $202.5 billion and increasing each year. Many systems besides New York's have reached the end of their useful design lives, and the nation's wastewater systems are no longer resilient in their ability to prevent failure or restore service after a disruption. In addition, the electrical generation and distribution system, the energy sector, contributes to the systems' lack of resilience. Pumps and wastewater treatment plants do not operate without power, and reduced reliability of the power supply is increasingly being addressed through the construction of dedicated emergency power generation at wastewater treatment plants and drinking water delivery systems. Clean and safe water should be the national priority. If we had spent even a quarter of the $821 billion of the American Recovery and Reinvestment Act of 2009 on the very unglamorous sewers and water treatment systems of the country, we would have a cleaner environment, part of our infrastructure would be ready for another century, and construction workers would have had jobs.


Monday, August 6, 2012

NASA’s Curiosity Rover Lands on Mars

Curiosity Rover during a test at MSL


Last night at 1:32 am (eastern time) the rover Curiosity, a large mobile laboratory, was set down inside the Gale Crater by NASA’s Mars Science Laboratory, MSL, mission. The one-ton rover, hanging from a rocket backpack, touched down onto Mars after a 36-week flight using precision landing technology to place Curiosity in the crater. The MSL spacecraft that carried Curiosity succeeded in every step of the most complex landing ever attempted on Mars, including the final severing of the bridle cords and flyaway maneuver of the rocket backpack.

Now Curiosity begins a two-year investigation to delve into the secrets of Mars, the Red Planet. The rover will analyze samples scooped from the soil and drilled from rocks. The record of the planet's climate and geology is essentially "written in the rocks and soil" -- in their formation, structure, and chemical composition. The rover's onboard laboratory will study rocks, soils, and the local geologic setting in order to detect forms of carbon, the chemical building blocks of life, on Mars. Curiosity carries a radioisotope power system that generates electricity from the heat of plutonium's radioactive decay. This power source gives the mission an operating lifespan on Mars' surface of a full Martian year (687 Earth days) or more, and hopefully enough time to gather the data needed to assess what the Martian environment was like in the past.

Below is one of the first images from Mars. According to John Grotzinger, project scientist of NASA's MSL mission at the California Institute of Technology in Pasadena, "In the image, we are looking to the northwest. What you see on the horizon is the rim of Gale Crater. In the foreground, you can see a gravel field. The question is, where does this gravel come from? It is the first of what will be many scientific questions to come from our new home on Mars." These first images of Mars are only half the size (in data) of the full-resolution Hazcam images that are expected to be sent back to Earth over the next several days. Check the NASA MSL site occasionally for new pictures and updates.
Gale Crater Mars August 6, 2012

Drought, Ethanol and the World Hunger Games


It is early August and the few acres of clover, weeds and grass that surround my house are once more green and growing. All through the spring and early summer I kept a close watch on the water level in the U.S. Geological Survey, USGS, groundwater monitoring well up the road and read with anticipation each week the weekly drought report from Mark Svoboda of the National Drought Mitigation Center. The USGS has been continually monitoring groundwater levels at a nearby well since 1979 and posting the level daily. It has been unusually hot and dry, and I watched the water level trough in April and again in late June, each dip followed by enough rainfall to bring the water level in the monitoring well (and, I assume, my drinking water well) back to normal and increase the depth and flow of the creek in the woods at the bottom of my land. We have managed to avoid drought around here. The Midwest and much of the Great Plains have not been as lucky.

Most of the Midwest of the country has experienced above-normal temperatures, with July coming in at 5-10 degrees above normal. Five to ten degrees! The region continues to be impacted not only by oppressive heat, but also by drought. Too little rain has left desiccated pastures and widespread crop damage, farmers are culling their livestock, and the fire risk is elevated. The drought persists, though some rain has fallen sporadically over the region. In the Great Plains the drought has continued to expand and temperatures remained 5 to 10 degrees above normal there, too. The drought continues to advance across more of eastern Nebraska, southeastern South Dakota, Kansas, Oklahoma and the Texas Panhandle, stressing pastures, crops, livestock, wildlife, and trees. The one cheerful note is southeastern Texas, which has continued to recover over the past several months from last year's drought. Overall, about 60% or more of the lower 48 states are experiencing some level of drought. Drought is not everywhere, but it is significant.

The National Agricultural Statistics Service (NASS) of the U.S. Department of Agriculture will release the yield and production forecasts for the 2012 U.S. corn and soybean crops on Friday, August 10th. The U.S. corn crop is the largest in the world. The USDA has been cutting its preliminary U.S. corn crop forecasts as the drought has progressed. Last word was that 45% of the corn crop is now estimated to be in poor or very poor condition. Iowa, the biggest corn producing state, had 37% of its crop listed as in fair condition. This drought follows significant flooding last year in several parts of the country that reduced overall corn crop yields to under 12.5 billion bushels.

Congress has been deadlocked on passing a full farm bill because there is not enough support in the Senate for the five-year farm bill that came out of the House. Instead, with potentially half the corn crop lost to the drought and pressure from cattle and other livestock producers worried about the cost of buying feed or culling their herds, the U.S. House passed a $383 million emergency relief package for livestock producers affected by the drought. The bill would have allowed payments of up to $100,000 per farm for cattle and sheep ranchers, but not hog and poultry farmers. Row crop farmers have insurance programs available to them, but the livestock programs expired in 2011 and this bill was an attempt to fill the gap. The Senate did not pass the drought measure before leaving Friday for its five-week recess, and it was tossed onto the pile of unfinished business.

In wet weather and dry we are the largest producer of corn in the world, but we have a problem that nature and Congress created together: the Renewable Fuel Standard, RFS, which creates a regulatory, mandated demand for corn. Last year the RFS-mandated ethanol consumed 5.05 billion bushels of corn. Almost two thirds of the nation is in drought, only 26% of the corn crop is in good or better condition according to the most recent USDA report, there are estimates that more than half of the corn crop is gone, and still we have to meet the RFS. The original United States Renewable Fuel Standard required that 7.5 billion gallons of renewable fuel (mostly ethanol made from corn) be blended into gasoline by 2012, but the program was expanded under the Energy Independence and Security Act (EISA) of 2007, which increased the volume of renewable fuel required to be blended into gasoline from 9 billion gallons in 2008 to 36 billion gallons by 2022. Last year, approximately 40% of the corn crop was used for making ethanol.

With half of this year's corn crop potentially destroyed by drought, on top of last year's flood-reduced yield and lower corn inventories, the RFS will make the crop situation worse by diverting most of the remaining corn crop into fuel, leaving diminished supplies for livestock and food producers. It pits the unrelenting demands of the RFS against the livestock and food producers. We should not have to choose fuel over food, or, more likely, have to import corn to feed our nation while we pay to convert corn into subsidized ethanol.

On Thursday, Bob Goodlatte of Virginia and 155 other members of Congress sent a letter to Administrator Lisa Jackson of the U.S. Environmental Protection Agency, EPA, asking the Administrator to exercise her authority under Clean Air Act section 211(o)(7) to reduce the required volume of renewable fuel based on harm to the economy. This is a wonderful opportunity to actually live in harmony with nature, to keep the cost of food from rising even more and to keep us from taking food from the mouths of poorer nations. Yes, we can buy more corn if need be. The United States is a rich country and we will eat meat and the long list of foods made from corn products. According to Tyler Cowen, professor of economics at George Mason University, in his book An Economist Gets Lunch: New Rules for Everyday Foodies, "(To put ethanol into gasoline) costs a lot more money than does traditional gasoline, once the cost of the subsidy is included. Sadly, it does not even make the environment a cleaner place. The energy expended in growing and processing the corn is an environmental cost too…the nitrogen-based fertilizers used for the corn are major polluters. Ethanol subsidies are a lose-lose policy on almost every front, except for corn farmers and some politicians." "For millions of (people in poor countries) it is literally a matter of life and death and yet we proceed with ethanol for no good reason…(Biofuels) has thrown millions of people around the world back into food poverty." Is it our goal to be the people of the Capitol of Panem and have tributes from poorer nations play the Hunger Games?

Thursday, August 2, 2012

Purchasing a Home with a Water Well-Caveat Emptor


In rural areas or here on the edge of nowhere, private wells supply water to homes. If you are thinking about buying a home with a private water well you need to understand, at a minimum, the basics about groundwater, the local geology, water quality, how the well system works, how deep the well is, and how old and what size pump it has. These are the factors that will impact water reliability, water quantity, and water quality. It would be a real shame to discover after closing on a home that the drinking water well does not supply enough water for you to do laundry in the summer, goes dry in a drought, or that the water is contaminated or has an unpleasant taste or smell. It can be very expensive to replace a well or well components, engineer solutions to water supply problems, and install and maintain a water treatment system. A home inspection tells you nothing about the well or septic system.

About 15% of households in the United States depend on private wells, including over a million each in Virginia and Pennsylvania where I once lived. In its most basic sense a private water well is a hole in the ground that is drilled, driven, or hand dug to supply water for a household. Most wells today are drilled by a cable tool or by an air-rotary drill. Hand-dug wells are usually very old but still exist; they are very susceptible to pollution from surface sources and may also present easy routes for surface contaminants to enter the aquifer. If considering purchasing a home with an old hand-dug or driven well, factor the cost of replacing it with a modern drilled well into the price, and be aware that not all pieces of land have suitable aquifers to tap.

An aquifer is the geologic formation that holds the groundwater. Aquifers may occur a few feet below the land surface, but useful aquifers are more commonly found at depths greater than 100 feet in Pennsylvania and 100-400 feet into the bedrock in Virginia. Some groundwater occurs in the pore spaces of solid rock, but most usable groundwater occurs in cracks and fractures in rock layers or between the sand and gravel particles of unconsolidated layers. Except in karst terrain, which has its own special problems, groundwater normally occurs in small spaces within the geological layers and not as underground lakes or rivers.

Geologic formations called aquitards are usually made of clay or dense solid rock. Aquitards inhibit groundwater infiltration and restrict groundwater movement into and between aquifers. Aquitards located above and below an aquifer form a confined aquifer. If a well is drilled into a confined aquifer, artesian pressure forces the trapped water to rise in the well above the top of the aquifer; if the pressure is great enough, the water may even flow to the land surface without pumping, creating a flowing artesian well. The downside to a confined aquifer is that recharge is limited. The coastal plain in Virginia has a confined aquifer that is really only recharged at the "fall line." A groundwater aquifer without an aquitard above it is an unconfined aquifer, more susceptible to contamination but more easily recharged by precipitation. Without pumping, the water level in a well in an unconfined aquifer is the same as the water table of the aquifer. The county department of health, the extension office, and the local offices of the U.S. Geological Survey are good places to find out about the local geological and groundwater conditions. You need to understand what type of aquifer you are dealing with to be aware of the factors that impact water quantity and quality. There are dry years and wet years and water availability will vary, though it is not always obvious. The groundwater aquifer tapped for water cannot be seen, so you need to understand it to know the water budget that you will have to live within before you run out of water.

In many locations private wells are not regulated or are only minimally regulated. Virginia now has well drilling regulations and standards, but those only apply to wells drilled after 1992 and require a health department permit for drilling new wells and repairing older ones. If you buy a home with an existing well, buyer beware. It is your responsibility to make sure that you know what you are buying. The type of well, the well yield, the condition of the well and the quality of the water are yours to determine before purchase. The type of well and its configuration will be determined by the age of the well and the geology. While there was a time when some wells were hand dug with a shovel or hand driven using connected pieces of pipe (as featured in Hallmark channel movies), most wells use equipment to drill, dig or drive pipe, and by and large modern wells are drilled. Nonetheless, there are thousands of homes supplied with water from older wells. Ask, look, investigate. Check the driller's logs filed with the health department.

The type of well is determined by geology and history. My home sits in the Culpeper groundwater basin above fractured rock and bedrock, where drilled wells are predominant. Here in Prince William, Virginia, in the twenty-first century it is typical to drill a well through a first and second layer of groundwater. I have a second groundwater level I can drop my pump down to if need be in a drought.

Drilled wells penetrate about 100-400 feet into the bedrock. To continually supply water, a drilled well must intersect bedrock fractures containing groundwater. The art of well drilling is having a feel for what a fracture looks like at the surface, knowing where the water is. A trained and experienced hydrogeologist can generally find the fractured rock zones, or the intersection of two fractured rock zones, using aerial photography and the fracture trace technique. It can be expensive to hire a professional hydrogeologist, but the cost is worthwhile for difficult areas. Where I live in the Piedmont region of Virginia the local geology is a fractured rock system that is water rich, with more than one groundwater layer, so using hydrogeologists is not common; but I do know of a couple of instances where several wells were drilled before adequate water flow was obtained, and it might have been more cost effective to locate the water before the house was built or to design a different well system. The Piedmont tends to be so water rich that alternatives are not in common use. In some regions low production wells are common.

If a property has a low producing well, there are ways to deal with it. The first is water conservation and the second is to increase water storage within the system. Water conservation involves changing water-use behavior, such as taking shorter showers, but usually means installing water saving devices like a front-loading washer (which saves 20 gallons of water for each load), low flush toilets, and flow restricting faucets and shower heads. Installing water saving appliances can reduce household water use by up to 30%. Water conservation may solve the problem of a 5 gallon a minute well, but increasing water storage can make a reliable 1 gallon a minute well viable for a modern household. An intermediate storage system consists of a storage tank, reservoir or cistern installed between the well and the pressurized distribution system. The reservoir serves as the primary source of supply for the pressure pump supplying peak demand. Ideally, the storage tank or cistern should be able to hold at least a day's water supply and be regulated by a float switch or water level sensor. A 1 gallon a minute well can pump 1,440 gallons per day, more than adequate for a household of almost any size. The rule of thumb is to size a storage tank or cistern at 100 gallons per person in the household. It is much cheaper to install intermediate storage than to keep drilling wells.
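For readers who like to see the arithmetic, here is a minimal sketch in Python of the figures above. The 1 gallon a minute well and the 100 gallon per person rule of thumb come from the paragraph; the household of four is only an assumption for illustration.

# Back-of-the-envelope arithmetic for a low-yield well and intermediate storage.
MINUTES_PER_DAY = 24 * 60  # 1,440 minutes in a day

def daily_well_yield(gallons_per_minute):
    # Gallons a well can deliver over 24 hours of steady recovery.
    return gallons_per_minute * MINUTES_PER_DAY

def cistern_size(people, gallons_per_person=100):
    # Rule-of-thumb storage: about 100 gallons per person in the household.
    return people * gallons_per_person

well_gpm = 1       # a slow, 1 gallon a minute well
household = 4      # assumed household size for illustration

print(daily_well_yield(well_gpm))   # 1440 gallons per day
print(cistern_size(household))      # 400 gallon tank or cistern

The point of the sketch is simply that even a slow well refills storage faster than a household can drain it over a full day, which is why a properly sized cistern makes a 1 gallon a minute well livable.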

Once you have determined that the water supply is adequate, the water quality should be checked. The quality of the groundwater is a characteristic of the aquifer and of the ability of your local geology to protect the aquifer. The most common sources of groundwater pollution fall into two categories: naturally occurring contaminants and those caused by human activities. Naturally occurring contaminants are produced by the underlying soil and rock geology. Microorganisms in the soil can travel into groundwater supplies through cracks, fissures, and other pathways. Nitrates and nitrites from the nitrogen compounds in the soil can also enter the groundwater. From the underlying rocks, radionuclides and heavy metals can enter the groundwater. There are areas with naturally occurring arsenic, cadmium, chromium, lead, selenium and fluoride. While many natural contaminants such as iron, sulfate, and manganese are not considered serious health hazards, they can give drinking water an unpleasant taste, odor, or color.

Human activities can also introduce contaminants into the groundwater. Bacteria and nitrates can come from human and animal waste. Improperly constructed and sealed wells can allow surface contamination to enter the well. Improperly maintained septic systems (which carry human waste and any chemical you flush down the drain), horses, and backyard poultry can contaminate the groundwater. Leaks from underground storage tanks, surface disposal of solvents, motor oil, paint, paint thinner, and termite treatments, or nearby or historic landfills or industrial operations can contaminate groundwater. A confining geological layer can protect groundwater from surface contaminants more effectively than a fractured rock system, and there is extremely limited natural protection in karst terrain. Though it is cost prohibitive to test for every potential contaminant, a broad baseline analysis should be performed before purchasing a home (and every few years thereafter). A bacteria test is not enough. The cheapest way to do this is a commercial product aimed at the private homeowner. One product I have used is the WaterCheck with Pesticides. This product covers 15 heavy metals, 5 inorganic chemicals, 5 physical factors (like hardness and pH), 4 trihalomethanes, 43 volatile organic chemicals (solvents), and 20 pesticides, herbicides and PCBs. The analysis takes two weeks, so the contingency period must allow for that or a more expensive analysis must be used. Whatever analysis you use, make sure you use an EPA certified laboratory that is also certified in the state where the property is located.

All private wells should have a water-tight, vermin-proof well cap and a cement or bentonite grout seal between the borehole and the well casing to prevent surface contamination and bacteria from entering the well. In coal or gas country, a well should also include a vent. Well water should be tested annually to ensure a safe drinking water supply for your family. The supply of water should be adequate. In our modern world household water demand is not spread evenly over the day; there are peak usage times driven by washing machines, dishwashers and showers. An adequate water system must be able to yield enough water to satisfy peak demand. Look for a minimum of 10 gallons a minute from a modern drilled well to supply a household, or a system with intermediate storage (remember, in some instances the well itself can provide storage). To live comfortably with your well, your water system has to be able to deliver an entire day's worth of clear, uncontaminated water within a 90 minute window.
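Here is a similar minimal sketch of that 90 minute peak-demand check. The 10 gallon a minute figure and the 90 minute window come from the text; the 75 gallons per person per day demand and the household of four are assumptions for illustration only.

# Rough check of whether a well plus storage can deliver a day's worth of
# water within a 90 minute peak window.
PEAK_WINDOW_MINUTES = 90

def daily_demand(people, gallons_per_person_per_day=75):
    # Assumed daily household demand; adjust for your own household.
    return people * gallons_per_person_per_day

def meets_peak_demand(well_gpm, storage_gallons, people):
    # Water available in the peak window: well flow plus whatever is in storage.
    deliverable = well_gpm * PEAK_WINDOW_MINUTES + storage_gallons
    return deliverable >= daily_demand(people)

print(meets_peak_demand(well_gpm=10, storage_gallons=0, people=4))    # True: 900 >= 300
print(meets_peak_demand(well_gpm=1, storage_gallons=0, people=4))     # False: 90 < 300
print(meets_peak_demand(well_gpm=1, storage_gallons=400, people=4))   # True with a 400 gallon cistern

As the last two lines show, a slow well fails the peak-demand test on its own but passes it with an intermediate storage tank, which is the whole argument for a cistern on a low-yield well.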