On Christmas Eve 2016, a 250-foot-long, 100-foot-wide sinkhole opened up in Fraser, Michigan, a suburb of Detroit. Authorities believe it formed after an 11-foot-wide sewer pipe burst 55 feet below ground; however, a sinkhole this massive means the leak had been ongoing for some time and may indicate other problems. The hole continued to grow over the holiday weekend, and authorities say the ground won't be safe enough for residents to return for at least two weeks. The mayor of Fraser has declared a state of emergency. Gas and water have been shut off while engineers and contractors work to stabilize the sinkhole and begin filling it back in. Three homes have been destroyed and 22 families evacuated.
The sinkhole runs along 15 Mile Road, which divides the communities of Fraser and Clinton Township, and is expected to shut down 15 Mile for several months. This isn't the first time a sinkhole has struck the area; it is at least the third time a huge sewer pipe has failed in this immediate vicinity, most recently when the same road caved in in 2004. Contracts for repairs on the same 11-foot-diameter pipe, the cause of the 2004 sinkhole, were at the center of racketeering charges against former Detroit Mayor Kwame Kilpatrick.
Macomb and Oakland counties have spent about $170 million on sewer infrastructure repairs during the past 12 years, designed specifically to prevent this type of catastrophic failure from happening again. The 2004 collapse took more than $50 million and 10 months to fix, and this sinkhole is just outside that repair zone. According to a lawsuit filed against the Detroit Water and Sewerage Department by area residents following the 2004 sinkhole, the same line collapsed in the same location in 1978, only six years after construction was completed.
Repeated catastrophic sewer line failures involving pipes within their operational life span are usually caused by one or more of three things: poor construction, poor engineering and design, or a lack of maintenance. Macomb public works officials will have to determine why, specifically, the latest sinkhole occurred.
The sewer lines move about 70 million gallons of wastewater a day from suburban Macomb and Oakland counties to the Detroit wastewater treatment plant. The city of Detroit owned the sewer line system that included the 15 Mile Interceptor until 2009, when Oakland and Macomb counties each took over ownership of their respective sewer infrastructure. Inspections following the 2004 sinkhole had revealed several miles of sewer lines in need of significant repair, and the piping system was transferred to the counties, which had better bond ratings than Detroit. This allowed the money for the work to be borrowed at a lower rate, making the repairs more affordable.
However, Detroit managed the work, and former Detroit Mayor Kwame Kilpatrick is now serving 28 years in federal prison for multiple crimes, including contract-fixing on the $54.3-million contract for the repair of the 2004 sinkhole on 15 Mile Road in Sterling Heights. In court documents, Kilpatrick denied any wrongdoing in the sinkhole repair and argued that the Detroit water department was "completely responsible for every administrative decision" made during the job.
Right now the raw sewage is being diverted to the Clinton River due to the wet weather. Officials say there should be no problems with dry-weather sanitary sewer flows, which are still being routed through the collapsed interceptor pipe. Rainfall or snowmelt, however, will overtax the system, as occurred during the recent rains and snowmelt over the holiday weekend. Temporary measures to mitigate the environmental impact of the sewage release are being investigated. Why this particular area is experiencing multiple sinkholes, when similarly old and neglected sewer pipelines have not, remains unanswered.
Thursday, December 29, 2016
Monday, December 26, 2016
EPA Bans TCE at Dry Cleaners
The U.S. Environmental Protection Agency (EPA) announced this month that it is proposing to ban trichloroethylene (TCE) due to health risks when it is used as a degreaser and a spot remover in dry cleaning. Specifically, EPA is proposing to prohibit the manufacture or import, processing, and distribution of TCE for use in aerosol degreasing and for spot cleaning in dry cleaning facilities. The proposal will be open for public comment for 60 days. The Administrative Procedure Act requires that agencies issue a notice of proposed rulemaking, provide an opportunity for public comments, issue a final rule with a concise statement of its basis and purpose, and make the final rule effective a minimum of 30 days after publication in the Federal Register.
TCE is a volatile organic compound (VOC). It is a clear, colorless liquid with a sweet odor that evaporates quickly. It is a dense non-aqueous phase liquid that can pass rapidly through cracks and imperfections in concrete and asphalt, and through those materials themselves, and can travel great distances in groundwater. EPA estimates that 250 million pounds of TCE are used each year in the United States.
TCE is a toxic chemical with human health concerns. EPA identified serious health risks to workers and consumers associated with TCE in a 2014 assessment, which concluded that the chemical can cause a range of adverse health effects, including cancer, developmental and neurotoxic effects, and toxicity to the liver.
Trichloroethylene was introduced as a dry cleaning solvent in the United States in 1930. TCE was found to cause bleeding of some acetate dyes at temperatures above 75 degrees Fahrenheit, so it was never widely used in this country as a primary dry cleaning solvent. TCE is, however, still widely used as a dry-side pre-cleaning or spotting agent and in water repellent agents. Nothing removes lipstick from silk like TCE, and it is the principal ingredient in Fast PR, 2-1 Formula, Picrin, Puro, SemiWet Spotter, Spra-Dri and Volatile Dry Spotter (V.D.S.).
The majority (about 84%) of TCE is used in a closed system as an intermediate chemical for manufacturing refrigerant chemicals. Much of the remainder (about 15%) is used as a solvent for metals degreasing, leaving a small percentage for other uses, including use as a spotting agent in dry cleaning and in consumer products. This rule follows a July 2015 agreement in which EPA secured a voluntary phase-out of TCE in an aerosol arts-and-crafts spray fixative product and ensured that EPA will have the opportunity to review any effort to resume or begin new consumer uses of TCE.
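Those percentages can be turned into rough annual poundages using EPA's 250-million-pound estimate (the per-category figures below are back-of-the-envelope arithmetic, not numbers from the EPA proposal):

```python
# Rough breakdown of annual U.S. TCE use, from EPA's ~250 million lb
# estimate and the shares cited above: ~84% refrigerant intermediate,
# ~15% metals degreasing, and roughly 1% for everything else.
TOTAL_LBS = 250_000_000

shares = {
    "refrigerant intermediate": 0.84,
    "metals degreasing": 0.15,
    "dry cleaning and consumer products": 0.01,  # the small remainder
}

for use, share in shares.items():
    print(f"{use}: about {share * TOTAL_LBS / 1e6:.1f} million lbs/yr")
```

Even at roughly 1% of use, the dry cleaning and consumer categories still amount to a few million pounds a year, which is why EPA singled them out.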
EPA also found risks associated with TCE use in vapor degreasing, and the agency is developing a separate proposed regulatory action to address those risks. Last week, EPA announced the inclusion of TCE on the list of the first ten chemicals to be evaluated for risk under TSCA. That action will allow EPA to evaluate the remaining uses of the chemical. This month's action proposes to ban only use as a degreaser and spot remover in dry cleaning.
Thursday, December 22, 2016
Will Young Blood Help Alzheimer's Patients?
In the news last week was the failure of an experimental Eli Lilly & Co. drug to measurably help Alzheimer's disease patients. This failure in the third round of clinical trials revives doubts about the focus that has dominated Alzheimer's research in the past decade: preventing beta-amyloid plaques. Lilly and other companies have spent hundreds of millions of dollars developing drugs that try to stop the buildup in the brain of beta-amyloid plaques, a sticky protein that many believe to be the primary culprit in the disease. However, none of these drugs has worked in a major patient study, and Lilly's failure could accelerate the pursuit of different research strategies.
A different approach is underway at the Stanford School of Medicine in Palo Alto, California, where researchers are running a clinical trial in which a small number of older volunteers with mild to moderate Alzheimer's receive transfusions of blood plasma donated by people under 30. The patients' caretakers are keeping journals to try to track any changes. Unfortunately, it is devilishly difficult to measure Alzheimer's objectively. Nonetheless, the scientists are hopeful.
The Stanford team, led by Dr. Tony Wyss-Coray, is not the first to wonder whether the answer to the problem of ageing might lie in human blood. One of the first physicians to propose blood transfusions to rejuvenate older people was Andreas Libavius, a German doctor and alchemist, in the 1600s.
In more modern times, years of animal research suggest that transfusions of young blood can improve organ and brain health. Dr. Amy Wagers of Harvard University discovered in 2012 that a protein in blood plasma called growth differentiation factor 11 (GDF11) seemed to be linked to rejuvenating effects in old mice with a heart condition.
In Dr. Wagers's experiments, ageing mice were given injections of GDF11 for 30 days, and their hearts responded just as they had when the mice were given blood from younger animals. This appeared to verify that GDF11 is one of the ingredients in blood responsible for rejuvenation, and it led Dr. Wyss-Coray and his team to test plasma's effect on brain function. The Stanford team believes that GDF11 could also play a role in rejuvenating the brain in humans.
The studies all point in one direction. Among the hundreds of substances found in blood are proteins that keep tissues youthful, and proteins that make them more aged. Dr. Wyss-Coray has a hypothesis: when we are born, our blood is awash with proteins that help our tissues grow and heal. In adulthood, the levels of these proteins plummet. The tissues that secrete them might produce less because they get old and wear out, or the levels might be suppressed by genetic programming. Either way, as these pro-youthful proteins vanish from the blood, tissues around the body start to deteriorate.
If the test of young human plasma is successful, what then? As Dr. Wyss-Coray said: "If it actually works? That would be huge. Every patient would want it." If the trial works, there are ways to obtain plasma. A startup company has now launched the first clinical trial in the United States to test the anti-aging benefits of young blood in relatively healthy people. It's a pay-to-participate trial: Ambrosia, in Monterey, California, plans to charge participants $8,000 for lab tests and a one-time treatment with young plasma. The trial is open to anyone 35 and older with cash to spend.
Monday, December 19, 2016
EPA Issues Its Fracking Report
The final report concludes: "Because of the significant data gaps and uncertainties in the available data, it was not possible to fully characterize the severity of impacts, nor was it possible to calculate or estimate the national frequency of impacts on drinking water resources from activities in the hydraulic fracturing water cycle." However, a nice addition to the final report is the "Synthesis" section. This section, written for state and local regulators, identifies the factors that can be managed, changed, or used to help reduce current vulnerabilities of drinking water resources to activities in the hydraulic fracturing water cycle. Since the overall rate at which fracking impacts our water resources appears to be low, this section offers excellent suggestions for reducing the risk even further through regulation and monitoring.
The final report examines all the potential vulnerabilities in the water lifecycle that could impact the 3,900 potential drinking water sources for the 8.6 million people living within one mile of a hydraulically fractured well. EPA’s assessment relies on existing scientific literature and data. Literature evaluated included articles published in science and engineering journals, federal and state government reports, non-governmental organization (NGO) reports, and industry publications. Data was gathered from databases maintained by federal and state government agencies, other publicly-available data and information, and data, including confidential and non-confidential business information, submitted by industry to the EPA.
EPA concluded that there are above-ground and below-ground ways that hydraulic fracturing can potentially impact drinking water resources. These mechanisms include water withdrawals in times of drought or in areas with limited water availability; spills of hydraulic fracturing fluids and produced water; fracking directly into underground drinking water resources; below-ground migration of liquids and gases from inadequately cased or cemented wells; and inadequate treatment and discharge of wastewater.
EPA did not find evidence of widespread impacts on drinking water resources, but it did find specific instances where one or more failures in design, well completion, or fluid storage led to contamination of drinking water wells. The number of identified cases, however, was small compared to the number of hydraulically fractured wells reported by EPA. This low level of documented contamination might be due to insufficient pre- and post-fracturing data on the quality of drinking water resources; the lack of long-term, systematic studies and careful monitoring of groundwater resources; and the presence of other sources of contamination that preclude a definitive link between hydraulic fracturing activities and an impact, particularly where fracking is used to redevelop and extend the life of existing oil and gas wells.
The assessment follows the water used for hydraulic fracturing through water acquisition, chemical mixing at the well pad site, well injection of fracking fluids, the collection of hydraulic fracturing wastewater (including flowback and produced water), and wastewater treatment and disposal. Though hydraulic fracturing used on average 44 billion gallons of water a year in 2011 and 2012, this represented less than 1% of total annual water use in the United States. However, in areas of drought, the median of 1.5 million gallons needed to frack a well could impact water supplies.
The amount of water used to frack a well is determined by well length, formation geology, and the fracking fluid formulation. The overall amount of water used in fracking is a small fraction of the daily water use in the United States.
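The "less than 1%" claim is easy to sanity-check. As a rough sketch, the national total below assumes a U.S. withdrawal rate of roughly 355 billion gallons per day (a commonly cited USGS estimate, not a figure from the EPA report):

```python
# Back-of-the-envelope check: fracking's share of national water use.
FRACKING_GAL_PER_YEAR = 44e9   # EPA: average annual use in 2011-2012
US_GAL_PER_DAY = 355e9         # assumed USGS national withdrawal estimate

us_gal_per_year = US_GAL_PER_DAY * 365
share = FRACKING_GAL_PER_YEAR / us_gal_per_year
print(f"Fracking's share of U.S. water use: {share:.2%}")  # well under 1%
```

Nationally the share is tiny; the concern is purely local, where 1.5 million gallons drawn over a few days can strain a drought-stressed supply.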
EPA found that surface spills were by far the most commonly reported incident, though they are also the most easily observed. The reported volume of fracturing fluids or chemicals spilled ranged from 5 gallons to more than 19,000 gallons, with a median volume of 420 gallons per spill. Spill causes included equipment failure, human error, failure of container integrity, and other causes (e.g., weather and vandalism); the most commonly cited cause was equipment failure. The frequency of on-site spills from hydraulic fracturing could be estimated for only two states. If those estimates are representative, the number of spills nationally could range from approximately 100 to 3,700 annually, assuming 25,000 to 30,000 new wells are fractured per year.
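The per-well spill rate implied by those extrapolations spans a wide range, which is itself a measure of how uncertain the two-state data are. A sketch using only the numbers quoted above:

```python
# Implied annual spill rates from EPA's extrapolation: 100-3,700 spills
# across an assumed 25,000-30,000 newly fractured wells per year.
low_spills, high_spills = 100, 3_700
low_wells, high_wells = 25_000, 30_000

lowest_rate = low_spills / high_wells    # fewest spills over most wells
highest_rate = high_spills / low_wells   # most spills over fewest wells
print(f"Spills per new well: {lowest_rate:.1%} to {highest_rate:.1%}")
```

That is anywhere from roughly one spill per 300 new wells to about one per 7, a forty-fold spread.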
EPA identified a list of 1,076 chemicals used in hydraulic fracturing fluids over multiple wells and years. These chemicals include acids, alcohols, aromatic hydrocarbons, bases, hydrocarbon mixtures, polysaccharides, and surfactants. According to the EPA’s analysis of disclosures, the number of unique chemicals per well ranged from 4 to 28, with a median of 14 unique chemicals per well. In addition, EPA reports an estimated 9,100 gallons of chemicals are mixed with the median 1.5 million gallons of water per well. Given that the number of chemicals per well ranges from 4 to 28, the estimated volume of chemicals injected per well may range from approximately 2,600 to 18,000 gallons.
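Those medians imply that fracturing fluid is overwhelmingly water. A sketch using only the figures quoted above:

```python
# Chemical additives as a fraction of the median fracturing fluid volume.
CHEMICALS_GAL = 9_100     # EPA's estimated chemical volume per well
WATER_GAL = 1_500_000     # median water volume per well

fluid_total = WATER_GAL + CHEMICALS_GAL
chem_fraction = CHEMICALS_GAL / fluid_total
print(f"Additives: {chem_fraction:.2%} of total fluid volume")  # ~0.6%
```

A fluid that is about 99.4% water can still deliver thousands of gallons of concentrated chemicals to a single well pad, which is why the spill volumes above matter.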
EPA found that groundwater can be impacted by fracking fluids or methane gas if the casing or cement on a well is inadequately designed or constructed, or fails. A study done in the Williston Basin in North Dakota suggests that the risk of groundwater contamination from leaks inside the well decreases by a factor of approximately one thousand when the well casing extends below the bottom of the drinking water aquifer.
EPA suggests that fracking older wells to restore production can contribute to casing degradation and failure, which can be accelerated by exposure to corrosive chemicals such as hydrogen sulfide, carbonic acid, and brines; no data were provided on this risk. The study found that one of the best protections for groundwater is physical separation between the gas production zone and groundwater resources. Many hydraulic fracturing operations target deep formations such as the Marcellus Shale or the Haynesville Shale, where the vertical distance between the base of drinking water resources and the top of the shale formation may be a mile or greater.
However, not all hydraulic fracturing is performed in zones deep below drinking water resources, though it should be. The EPA's survey of oil and gas production wells hydraulically fractured in 2009 and 2010 estimated that 20% of wells had less than 2,000 feet of vertical separation between the point of shallowest hydraulic fracturing and the base of the protected groundwater. There are also places in the subsurface where oil and gas resources and drinking water resources coexist in the same formation. Hydraulic fracturing within these formations injects fracturing fluids into formations that may currently serve, or in the future could serve, as a source of drinking water for public or private use, and it should not be allowed.
Water, of variable quality, is a byproduct of oil and gas production. After hydraulic fracturing, the injection pressure is released and water flows back from the well. Initially this water is similar to the hydraulic fracturing fluid, but over time its composition is affected by the characteristics of the formation and possible reactions between the formation and the fracturing fluid. EPA calls all of this water produced water. The final area EPA looked at was management of this produced water, which is generated in large volumes. In 2007, approximately one million active oil and gas wells in the United States generated 2.4 billion gallons a day of wastewater. It is unknown what portion of this total volume comes from hydraulically fractured wells but, really, after the flowback period there is little difference.
As the report points out, wastewater management and disposal could impact drinking water resources. Inadequate treatment of wastewater could result in the discharge of contaminated water to rivers and streams, though in recent years a larger proportion of fracking produced water has been reused. In addition, spills can occur during transportation of wastewater away from the well head, or from leaking wastewater storage pits. Contaminants can also migrate from inappropriate land application of wastewater and other inappropriate methods of wastewater treatment.
The takeaway is that there is still a need for more research to fully model and understand fracking. In addition, there is a need for more extensive baseline studies prior to drilling, and for long-term monitoring, to even know whether water (or human health) has been impacted by fracking. Pre-drilling data need to include measurements of groundwater and surface-water quality and quantity. There have been virtually no comprehensive studies on the impact of fracking on human health, while state regulators and laws in some instances allow fracking virtually in people's backyards and without adequate vertical separation from groundwater supplies. Fracking needs to be well understood, and its risks managed, to make sure that it is a boon to mankind and is used only in appropriate geology and low-risk locations. Many of the sources of contamination identified by EPA could be prevented or reduced with more care, training, and thought. Overall, hydraulic fracturing for oil and gas is a practice that continues to evolve, and our methods and controls should continue to evolve with it.
Thursday, December 15, 2016
Dominion Power Offers Neighbors Public Water Connection
As reported by the Richmond Times-Dispatch, last week Dominion Power announced that it will pay for homes near the Possum Point Power Plant to be hooked up to Prince William County Public Service Authority water or to receive filtration systems. Offering the neighbors peace of mind and a safe source of drinking water is the right thing to do.
If you recall, last fall Dominion Power agreed to install additional groundwater monitoring wells and conduct bi-weekly monitoring of the new wells. This brought the total number of monitoring wells to 24 and provided enhanced monitoring and protection for Quantico Creek, the upstream neighbors, and the Potomac River during the dewatering of the coal ash ponds at Dominion's Possum Point Power Station. Recent sampling of those wells showed elevated levels of boron, chloride, cobalt, nickel, sulfate, and zinc upstream of the ponds. Though Dominion maintains there is no evidence that its ash ponds have contaminated drinking water wells near the site, the company says the new results seem inconsistent with what it assumed about the geology and groundwater. Groundwater often surprises you.
Several rounds of sampling, by the Virginia Department of Health, the Virginia Cooperative Extension's Household Water Quality Program, the Potomac Riverkeeper Network, and a contractor hired by Prince William County, have found varying levels of metals in drinking water wells along Possum Point Road, such as hexavalent chromium, lead, antimony, and other constituents that can be associated with coal ash but are also naturally present in the groundwater of Virginia, as well as low pH levels that corrode plumbing.
I reviewed test results from both our Household Water Quality Program and the Virginia Department of Health. The VDH tested the water for contaminants that are regulated under the Safe Drinking Water Act (arsenic, barium, beryllium, cadmium, total chromium, mercury, lead, antimony, selenium, thallium, and radium). Though traces of various substances were found, none of the contaminant levels were above the MCLs or SMCLs of the Safe Drinking Water Act, so the water would be acceptable for public drinking water supplies. The VDH also tested for substances not regulated under the Safe Drinking Water Act: boron, calcium, cobalt, lithium, magnesium, sodium, nickel, vanadium, zinc, alkalinity, bicarbonate alkalinity, carbonate alkalinity, hexavalent chromium, molybdenum, strontium, thorium, and radium-228. The chart below shows the summary of what they found (you can request the information under the FOIA).
If you recall, Dominion Power has been moving forward with a plan to "close in place" the coal ash under the recently finalized U.S. EPA coal ash regulation. The plan for Possum Point is to consolidate all of the on-site coal ash, an estimated 3.7 million cubic yards, into one impoundment. Dominion has collected more than 1 million cubic yards of ash from four smaller ponds and put it in a 120-acre pond that already contains 2.6 million cubic yards of coal ash, which it has begun to dewater. Ultimately, the pond will be capped with an impermeable membrane to prevent future infiltration of rain. The groundwater results do not necessarily indicate contamination from the coal ash ponds, and Dominion Power will continue to press forward to obtain its permit to close the site.
Closing the coal ash on site, when properly done, is probably the best solution. A safe closure requires ongoing monitoring and maintenance, which is best accomplished at an operating and regulated plant rather than at a remote "cap and leave it" location. All physical barriers fail over time; this is addressed by monitoring and maintaining the systems.
If you recall last fall Dominion Power agreed to install additional groundwater monitoring wells and conduct bi-weekly monitoring of the new wells. This brought the total number of monitoring wells to 24 and provided enhanced monitoring and protection for Quantico Creek, the upstream neighbors and the Potomac River from the dewatering of the coal ash ponds at Dominion’s Possum Point Power Station. The recent sampling of those wells showed elevated levels of boron, chloride, cobalt, nickel, sulfate and zinc upstream of the ponds. Though, Dominion maintains that there is no evidence that its ash ponds have contaminated drinking water wells near the site, they are offering neighbors a and says the new results seem inconsistent with what they assumed about the geology and groundwater. Groundwater often surprises you.
Several rounds of sampling from the Virginia Department of Health, the Virginia Cooperative Extension’s Household Water Quality Program, the Potomac Riverkeeper Network and a contractor hired by Prince William County have found varying levels of metals, such as hexavalent chromium, lead, antimony and other constituents that can be associated with coal ash, but are also present naturally in the groundwater of Virginia as well as low pH levels that corrode plumbing, in drinking water wells along Possum Point Road.
I reviewed test results from both our Household Water Quality Program and the Virginia Department of Health. The VDH tested the water for thirteen contaminants that are regulated under the Safe Drinking Water Act (arsenic, barium, beryllium, cadmium, total chromium, mercury, lead, antimony, selenium, thallium, radium). Though traces of various substances were found, none of the levels of contaminants were above the MCLs or SMCLs of the Safe Drinking Water Act so would be acceptable for public drinking water supplies. The VDH also tested for substances not regulated under the Safe Drinking Water Act. These contaminants were: boron, calcium, cobalt, lithium, magnesium, sodium, nickel, vanadium, zinc, alkalinity, bicarbonate alkalinity, carbonate alkalinity, hexavalent chromium, molybdenum, strontium, thorium, radium-228 and vanadium. The chart below shows the summary of results of what they found (you can request the information under the FOIA).
If you recall, Dominion Power has been moving forward with a plan to “close in place” 3.7 million cubic yards of coal ash under the recently finalized U.S. EPA Coal Ash regulation. The plan for Possum Point is to consolidate all of the on-site coal ash into one impoundment. There is estimated to be 3.7 million cubic yards of coal ash. Dominion has collected more than 1 million cubic yards of ash from four smaller ponds, put them in a 120-acre pond that already contains 2.6 million cubic yards of coal ash that they have begun to dewater. Ultimately, the pond will be capped with an impermeable membrane to prevent future infiltration of rain. The groundwater results do not necessarily indicate contamination from the coal ash ponds, and Dominion Power will continue to press forward obtain their permit to close the site.
Closing the coal ash on site, when properly done, is probably the best solution. A safe closure requires ongoing monitoring and maintenance, which is best accomplished at an operating and regulated plant rather than at a remote “cap and leave” location. All physical barriers fail over time; this is addressed by monitoring and maintaining the systems.
Monday, December 12, 2016
Vegetarians and Sustainability
According to Dr. Vaclav Smil, formerly of the University of Manitoba and author of Should We Eat Meat?, Homo sapiens are natural omnivores with a high degree of preference for meat consumption. However, we are no longer hunter-gatherers; the increasing density of populations created the need to abandon hunting and gathering and progressively expand permanent settlements with crop farming. The agrarian life was accompanied by cultural adaptations, and meat restrictions and taboos turned meat into a relatively rare foodstuff for the majority of the population in traditional agricultural societies. The return to more frequent meat eating has been a transition in affluent economies.
World food production has changed over the centuries and so, too, has population health. Food production capabilities and capacity has increased greatly; maternal and child nutrition in high-income groups has improved; health and life expectancies have increased, at least partly because of nutritional gains; and refrigeration, transport, and open markets have increased year-round access to healthy foods for many populations.
In this holiday season I have been thinking about what food means within families, the many different styles of eating, and food sustainability. Our family holidays have traditional dishes that tell the story of our family. We also have vegetarians in the family, who consume a diet of fruits, vegetables, grains, legumes, eggs, and dairy. A diet limited to only plant-based food is called veganism. Then there are pescetarians (sometimes called pesco-vegetarians), who eat freshwater and saltwater fish and shellfish in addition to the fruits, vegetables, grains, legumes, eggs, and dairy of the vegetarian. Omnivores eat everything (hopefully in moderation), and on it goes.
A study published in JAMA in 2015 of over 77,000 Seventh-day Adventists, a population with a high proportion of vegetarians, found that compared with non-vegetarians, pesco-vegetarians had a 43% lower risk, and vegetarians a 22% lower risk, of colorectal cancer, which remains the second leading cause of cancer mortality in the United States. Dietary choices are important sources of modifiable risk for colorectal cancer, as well as part of our identity. Our style of eating is part of who we are.
For most of the 20th century agricultural science focused on increasing food production yield and efficiency. Only recently have scientists begun to include the ecological impacts of farming, and now nutritionists and agricultural scientists from Tufts University and Cornell have joined together to compare the per capita land requirements and potential carrying capacity of the land base of the continental United States (U.S.) under a diverse set of dietary scenarios. In other words, they have looked at how much land is necessary to feed a person under various eating scenarios.
Unsurprisingly, certain styles of eating require more land and water resources than others. The scientists found that diet composition greatly influences overall land footprint necessary to feed a person. The baseline scenario of how the U.S. eats on average today had the highest total land use, 1.08 hectares per person per year. Land requirements decreased steadily across the five “healthy” omnivorous diets with fewer calories and decreasing animal protein, from 0.93 to 0.25 hectares per person per year, and the total land requirements for the three vegetarian diets were all very low, 0.13 to 0.14 hectares per person per year.
The scientists found that diets including some meat can feed more people than vegan diets. As the amount of meat in the diet was reduced, the amount of land necessary to grow crops to feed livestock decreased until the crossover where more land was necessary to grow enough plant protein for human consumption. The scientists found that the vegetarian diet could provide adequate nutrition and feed almost 800 million people from the currently available agricultural land. However, differences in total per capita land requirements are only part of the story the scientists were telling.
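As a rough illustration (my own back-of-the-envelope arithmetic, not the study's calculation, and the diet labels are shorthand rather than the study's scenario names), the per capita figures above imply how many more people a given area can feed under each diet:

```python
# Per capita land requirements quoted above (hectares per person per year).
land_per_person = {
    "baseline U.S. diet": 1.08,
    "healthy omnivore, least meat": 0.25,
    "vegetarian": 0.135,  # midpoint of the 0.13-0.14 range
}

baseline = land_per_person["baseline U.S. diet"]
for diet, hectares in land_per_person.items():
    # People fed per hectare scales inversely with the land footprint.
    relative = baseline / hectares
    print(f"{diet}: {relative:.1f}x the people fed per hectare vs. baseline")
```

By this arithmetic a vegetarian diet feeds roughly eight times as many people per hectare as the current average diet; the study's point, though, is that not all grazing land can be cropped, which is why diets with modest amounts of meat can feed more people in absolute terms.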
I initially thought the data was intended to be used only to maximize the number of people that could be fed by our nation’s agricultural lands. The scientists state, however: “Provision of food, while essential, is not the only important ecological service provided by land. Some of these services, such as carbon capture, may be compatible with grazing, at least in well-managed systems. Other services, such as wildlife habitat, may be impinged where domesticated species compete for biomass with wild ruminants and ungulates. Finally, the use of perennial cropland for grazing or hay production could conceivably compete with bioenergy production where biomass energy or draft animals are possible alternatives to fossil fuels.”
Changing the American diet could improve health, reduce the need for agricultural land, make biomass available to meet the U.S. Environmental Protection Agency's mandated increase in renewable fuel volume requirements across all categories of biofuels under the Renewable Fuel Standard (RFS) program, fight increasing CO2 levels and support a healthier ecology. I thought the study was about how best to feed the growing population of mankind, but it wasn’t. It is part of the pivot toward climate under Executive Order 13653, “Preparing the United States for the Impacts of Climate Change.”
Wednesday, December 7, 2016
Haymarket: Do Not Drink Tap Water Without Boiling It First
A water main break in Haymarket, Virginia near University Boulevard between U.S. 29 and Wellington Road has resulted in a loss of pressure in the water distribution system. Prince William Service Authority has issued a boil water advisory. DO NOT DRINK THE WATER WITHOUT BOILING IT FIRST. Bring all water to a rolling boil, let it boil for one minute, and cool before using; or use bottled water. You should use boiled or bottled water for drinking, making ice, washing dishes, brushing teeth and food preparation until you are notified that the advisory has been lifted. Make sure to send your kids to school with bottles of water.
There are two types of boil water advisories: precautionary and mandatory. A loss of positive water pressure in the system from a water main break might allow contamination to enter the water distribution system. This is the most common type of advisory, which is issued as a precaution until water samples are collected and analyzed to confirm that water quality has not been affected. A mandatory boil water notice is issued when contamination is confirmed in the water system. Customers are instructed to boil the water to kill bacteria and other organisms in the water, until the issue is resolved and the notice can be lifted. Contamination from organisms, such as bacteria, viruses and parasites, can cause symptoms, including nausea, cramps, diarrhea and associated headaches.
After a water main break water samples must be collected to test for bacteria in the distribution system. The first samples are taken on the day when the water main break has been fixed, and then another set of samples are taken in the next 24 hours. Two consecutive days of "clean" test results are required before the water advisory can be lifted. (The process takes 24 hours for test results to come back from the laboratory, so final lab results to lift an advisory can take several days after the event.) The Prince William County Service Authority is advising that this Boil Water Notice will remain in effect for a minimum of 48 hours to provide adequate time for water quality testing. As more information becomes available, customers in the affected area will be notified.
Boiling the water kills microorganisms such as bacteria, viruses, or protozoans that can cause disease; boiling makes the tap water safe. Adding a tablespoon of household bleach, such as Clorox, to a sink full of tap water should be sufficient to treat the water used for washing dishes. Bleach should also be added to the water used for rinsing dishes. Allow dishes and utensils to air dry before reuse. Throw away uncooked food, beverages or ice cubes made with tap water since Tuesday afternoon.
• Keep boiled water in the refrigerator for drinking
• Do not swallow water while you are showering or bathing
• Provide pets with boiled water after cooling
• Do not use home filtering devices in place of boiling or using bottled water; Most home water filters will not provide adequate protection from microorganisms
• Use only boiled water to treat minor injuries; When showering or bathing, avoid allowing the water to come in contact with an open wound
• Do not wash salad items with tap water during the period; Use bottled water or freshly boiled and cooled tap water
Regency at Dominion Valley
Dominion Valley Country Club
Westmarket
Simmons Grove
Market Center
Village at Heathcote
Old Carolina Road Estates
Piedmont Mews
Longstreet Commons
Heritage Farms
Long Level Acres
Long Level Estates
Piedmont
Heritage Hunt
Piedmont South
Carterwood
Crossroads Village S1
Heathcote Commons
Village Place
Hillwood Park MHP
Gainesville Mobile Homes
Lakeview
Lakeview Estates
Wentworth Green
Virginia Oaks
Waverly Mill
Villages of Piedmont
Haymarket Station
Greenhill Crossing
Kennard Ridge
Somerset
Somerset Crossing
Gates Mill Estates
Gainesville Acres
Estates at Breyerton
Breyerton
Gateway Oaks
Blue Ridge Farms
Hopewells Landing
Madison Crescent
Lake Manassas
Regents at Lake Manassas
Reserve at Lake Manassas
Monday, December 5, 2016
Alberta Pledges to Implement Canadian Carbon Tax
With Alberta joining, four of Canada’s most populous provinces either already have or are introducing some kind of carbon price or carbon cap-and-trade program. Last month Prime Minister Justin Trudeau announced a minimum Canadian federal price on carbon. The initial price will be a minimum of $10 (Canadian) per metric ton (“tonne”) of CO2, and it will increase annually by $10/tonne to reach $50 in 2022, when it will be reviewed. The Prime Minister made this announcement while leading off parliamentary debate on the Paris climate change agreement Monday, making the case for Canada to cut greenhouse gas emissions by 30% from 2005 levels by 2030. (In parliamentary debate the Prime Minister leads and closes the debate.) The announcement came without warning.
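The announced price path can be sketched as a simple step function. Note that the 2018 start year is my assumption, implied by the endpoint (the post gives only the initial price and the $50 level in 2022):

```python
def carbon_price(year, start_year=2018, start_price=10, step=10, cap=50):
    """Minimum federal carbon price in Canadian dollars per tonne of CO2.

    Assumes the schedule begins in `start_year` at `start_price` and
    rises by `step` each year until it reaches `cap` (reviewed in 2022).
    """
    if year < start_year:
        return 0  # no federal minimum before the schedule begins
    return min(start_price + step * (year - start_year), cap)

for y in range(2018, 2024):
    print(y, carbon_price(y))
```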
Prime Minister Trudeau has long promised that Ottawa would impose a minimum carbon price on provinces unwilling to adopt their own system, but in his speech he effectively seized the leadership from the provincial premiers, who have insisted on the right to regulate carbon emissions as they see fit. Canada’s 10 provinces will need to meet that minimum, or exceed it, either by using a carbon tax or by achieving a comparable emissions reduction through a cap-and-trade system.
The Prime Minister said he will convene a first ministers’ meeting on December 8th with the aim of concluding an all-Canada climate plan, which would include carbon pricing and other measures. A first ministers’ meeting is a meeting of the provincial and territorial premiers and the Prime Minister. Several provinces and territories reacted angrily. Three of the environment ministers walked out of the federal-provincial climate talks after Mr. Trudeau’s unilateral announcement in the House of Commons. Most vocal in his protest was Saskatchewan Premier Brad Wall, who was reported to have called the carbon tax decision a betrayal of previous promises and assurances and to have said the tax would devastate his province’s economy.
However, last Tuesday the Trudeau government approved Kinder Morgan Canada Inc.’s $6.8-billion Trans Mountain pipeline, which would nearly triple capacity on an existing line from Edmonton to Burnaby, B.C. to about 900,000 barrels per day. This is an alternative route to the Keystone Pipeline to move the Canadian oil sands out of Canada via Canada’s Pacific coast. Now Alberta has committed to phase in a $50-per-tonne carbon tax after the federal government approved the pipeline.
In addition, Canada’s westernmost province of British Columbia already prices carbon at C$30 per metric ton. Canada’s two most populous provinces, Ontario and Quebec, are pursuing cap-and-trade systems, which Prime Minister Trudeau said will be deemed adequate if they achieve the same emissions reductions as the federal minimum price would achieve.
Under a carbon tax or fee, as in the Canadian federal program, the price on carbon pollution provides an incentive for everyone, from industry to households, to be part of the solution. A carbon tax can be very simple. It can rely on existing administrative structures for taxing fuels and can therefore be implemented in just a few months. Ultimately, the critical factor in reducing carbon emissions is the strength of the economic signal. A stronger carbon price will push more growth toward low-carbon, renewable energy and will encourage the adoption of greener practices.
A cap-and-trade system could achieve the same goals, but in practice such systems tend to be much more complex. More time is required to develop the necessary regulations, and they are more susceptible to lobbying and loopholes. Cap-and-trade also requires the establishment of an emissions trading market, which is expensive to operate and monitor. I support a carbon tax, and look forward to seeing how the system ends up operating in Canada. The U.S.-based Carbon Tax Center says that their computer model suggests that a U.S. carbon tax at that rate would reduce CO2 emissions in 2022 by 12-13% below “otherwise” emissions (without a carbon price) in the United States.
Thursday, December 1, 2016
Who Owns the Waters of the Potomac?
Procter & Gamble (P&G) plans to build a $500 million manufacturing plant to produce Pantene shampoos and Old Spice body wash just south of Martinsburg, West Virginia. Under a previous agreement the local water treatment plant is authorized to draw four million gallons a day from the Potomac River. The water treatment plant currently draws 2.4 million gallons a day, and the P&G plant will require an additional 1.3 million gallons a day; moreover, the P&G factory will spur additional residential and industrial development that will push the county over the four million gallons a day it is currently limited to. In a letter to Maryland officials, West Virginia Attorney General Patrick Morrisey said the water treatment facility that will supply water to the P&G factory “has an urgent need” to increase capacity beyond limits imposed by Maryland, and has threatened to file suit much as Virginia did at the turn of the 21st century.
West Virginia has a strong case. Maryland instituted a permitting program for waterway construction and water withdrawal on the Potomac River in 1933. Over the years, starting in 1957, Maryland issued many of these permits to various Virginia and West Virginia entities without objection. In 1996, the predecessor agency to Fairfax Water applied for permits to build a 725-foot water intake structure to supply the new James J. Corbalis Jr. Water Treatment Plant in Herndon. Maryland denied the permit.
After failing to obtain administrative remedies, Virginia filed a complaint with the U.S. Supreme Court in March 2000 seeking a declaration that Maryland lacked regulatory authority to veto the project. (In 2001, Maryland approved the permit, but with the condition that FCWA install a restrictor to limit withdrawal.) The Court accepted the case and referred the complaint to a Special Master for fact-finding and a recommendation. The Special Master reviewed the evidence submitted by both states and recommended that the Court rule for Virginia. In the Special Master’s opinion, Maryland did not have authority to regulate Virginia’s rights under the 1785 compact and 1877 award.
Maryland opposed this recommendation on two grounds: first, that as sovereign over the river to the Virginia border it had regulatory authority; and second, that even if Virginia’s rights under the compact and award were unrestricted, Maryland had acquired the right to regulate by way of Virginia’s acquiescence to its regulation since 1957.
Nonetheless, in a 7-2 majority the Supreme Court overruled Maryland’s objections to the Special Master’s report, and entered the decree proposed by the Special Master affirming Virginia’s sovereign rights under the 1877 arbitration award to build structures appurtenant to its shore and withdraw water from the Potomac River without regulation by Maryland. Now West Virginia has threatened to challenge Maryland’s authority over its withdrawals.
The effect of the Virginia case on other water disputes may be limited, since the decision turned on interpretation of historical documents unique to that case – the 1785 compact and 1877 arbitration award. However, when Virginia attempted to secede from the Union in 1861 during the Civil War, voters in 41 northwestern counties of Virginia (including Preston County) voted to secede from Virginia, and in 1863 the new state of West Virginia was admitted into the Union. The historical claims of Virginia are thus the same as those of West Virginia.
The Potomac River, although practically serving as the border between West Virginia and Maryland, is not the actual boundary. The Maryland Assembly passed legislation in April 1787 to formally establish the boundary. Francis Deakins was appointed surveyor and in 1788 established what became known as the "Deakins line," which became the de facto border of Maryland. Unfortunately, the Deakins line was not straight, and it was not a true meridian but rather drifted to the east. A second attempt was made to correct this error, but the dispute continued. Finally, in 1910 the United States Supreme Court ruled 9-to-0 that the boundary between Maryland and West Virginia is the south bank of the Potomac River.
Monday, November 28, 2016
LWV and Fracking in Virginia
The Virginia League of Women Voters' report was a well-researched and impartial review of the science, regulation and current status of fracking in Virginia. Rona Ackerman of Fairfax gave an excellent and engaging presentation of the report and led the discussion. Though I encourage you to read the report for an unbiased review of the technology and what we know about fracking, the most important takeaway was the status of fracking and fracking regulations in Virginia.
from DMME
Virginia has gas rich shale deposits. The U.S. Geological Survey estimated that the Taylorsville basin contains over a trillion cubic feet of gas. The Taylorsville basin has not been explored using newer fracking techniques so it is not known if we have the technology to exploit these deposits, yet. However, over 84,000 acres in the Taylorsville basin have been leased for 7 years by Shore Drilling.
The oldest type of hydraulic fracturing is coal bed formation fracturing, which has been used for more than 65 years. The volume of water needed for hydraulic fracturing varies by site and type of formation. Fifty thousand to 350,000 gallons of water may be required to fracture one well in a coal bed formation, while two to five million gallons of water injected at much higher pressure may be necessary to fracture one horizontal well in a shale formation. Virginia currently has gas wells only in the coal-rich Appalachian Plateau, where 6,000 of the 8,400 existing wells were dry fracked. The existing wells are vertical wells that were nitrogen fracked. This is a completely different technology than contemplated for the Taylorsville shale deposit.
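For a sense of scale, the per-well water ranges above can be compared directly. This is just a sketch; the gallon-to-cubic-metre factor is a standard conversion, and everything else comes from the figures quoted above:

```python
GAL_TO_M3 = 0.003785  # 1 US gallon in cubic metres

# Water required per well, in gallons, from the ranges quoted above.
wells = {
    "coal bed (vertical)": (50_000, 350_000),
    "shale (horizontal)": (2_000_000, 5_000_000),
}

for name, (low, high) in wells.items():
    print(f"{name}: {low * GAL_TO_M3:,.0f}-{high * GAL_TO_M3:,.0f} m^3 per well")

# Comparing the low ends: a horizontal shale well needs 40x the water
# of the smallest coal bed job, and the ratio across the full ranges
# runs from roughly 6x to 100x.
print(f"ratio of minimums: {2_000_000 / 50_000:.0f}x")
```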
from DMME
In 2013 then Virginia Attorney General Ken Cuccinelli issued an opinion that stated “a local governing body cannot ban altogether the exploration for, and the drilling of, oil and natural gas within the locality’s boundaries.” However, in May 2015 current Virginia Attorney General Mark Herring issued an opinion that stated “Localities may use their zoning authority to prohibit “unconventional gas and oil drilling,” commonly known as fracking.” Following this opinion the King George Board of Supervisors in the Taylorsville basin voted to amend their zoning ordinance and Comprehensive Plan, prohibiting drilling within 750 feet from resource protected areas, such as rivers and creeks, as well as roads, buildings and schools, leaving only 9% of the county potentially eligible for drilling.
In 2015 the Virginia Department of Mines Minerals and Energy (DMME) promulgated New Gas and Oil Regulations. In summary the regulations would:
(i) amend permit application requirements to include disclosure of all ingredients anticipated to be used in hydraulic fracturing operations, certification that the proposed operation complies with local land use ordinances, inclusion of a groundwater sampling and monitoring plan, and submission of an emergency response plan;
(ii) require a pre-application meeting jointly conducted by the DMME and the Department of Environmental Quality before an operator drills for gas or oil in Tidewater Virginia;
(iii) require well operators to use FracFocus, the national hydraulic fracturing chemical registry website, to disclose the chemicals used in hydraulic fracturing operations;
(iv) establish a groundwater sampling, analysis, and monitoring program before and after well construction;
(v) add language related to the use of centralizers in the water protection string of the casing;
(vi) strengthen casing and pressure testing requirements for well casings used in conventional and coalbed methane gas wells; and
(vii) provide protection for trade secrets from public dissemination while allowing this information to be made available to first responders and local officials in the event of an emergency.
The new Gas and Oil Regulations were submitted for final approval by Governor Terry McAuliffe last August. They are still awaiting approval and there is no timeline, but approval is expected this year. The gas industry has been trying to delay the regulations in Virginia in favor of a bill to exempt the gas industry from Freedom of Information Act requirements for fracking chemicals. This bill, HB1389, was carried over from last year and should not pass. It is important not only for first responders, but for citizens, to know what chemicals they are potentially being exposed to. From FracFocus data we know that 29 known or possible human carcinogens, regulated under the Safe Drinking Water Act or listed as hazardous air pollutants, were used in 650 out of 2,500 fracking products. Unless you know what chemicals to look for, it is virtually impossible to test air and water pathways for every possible contaminant. Please consider calling your delegate to vote against HB1389 this year.
Thursday, November 24, 2016
Water Delivery for the Caribbean
While I have spent the late summer watching a silent drought take over my corner of Virginia, other parts of the world are experiencing much bigger droughts. The islands of the Caribbean have been experiencing drought. Their drought started early last year. The Islands are mostly dry rock formations that collect rainfall in reservoirs across the region. Without the rains, the reservoirs are being drained, forcing utilities from Trinidad & Tobago to Jamaica to ration water.
For some islands, such as Cuba, it is reported to be the worst drought in more than 100 years. And this may be just the start. Now the tiny Republic of Suriname wants to sell some of its abundant water to its neighbors. Suriname is located on the coast of South America and has a reported 151 billion m³ of fresh water flowing to the ocean each year from its rivers.
Now, a company, Amazone Resources has received the rights from Suriname’s government to pump water from the mouths of the Coppename and Suriname rivers, both of which meet World Health Organization standards for water quality. The water will be filtered and treated with UV light to meet health standards. This week, a boat will tow a giant bag made from PVC-coated fabric with enough water to fill an Olympic-size swimming pool from Suriname to drought-stricken Barbados and Curacao. The bag will float because fresh water is lighter than salt water.
Amazone Resources has received permission to export up to 400 flex tanks a year. This is equivalent to 0.0092% of the flow of the rivers. Research has shown that removal of up to 0.129% of a river's flow can be accomplished without permanently disturbing the ecology. This will be a test run for a business to sell some of the excess water that flows to the sea from Suriname without disturbing the ecological balance. The Barbados Water Authority, which signed a memorandum of understanding for the test run but is not buying the initial shipment, said in a statement that the accord is part of its long-term plans to tackle the impact of climate change.
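Working backward from the figures above gives a sense of what a full flex tank holds. This sketch assumes the 0.0092% applies to the entire 151 billion m³ annual river flow:

```python
annual_flow_m3 = 151e9          # fresh water reaching the ocean each year
export_share = 0.0092 / 100     # permitted export as a fraction of flow
tanks_per_year = 400

annual_export_m3 = annual_flow_m3 * export_share
per_tank_m3 = annual_export_m3 / tanks_per_year
print(f"annual export: {annual_export_m3:,.0f} m^3")
print(f"implied volume per flex tank: {per_tank_m3:,.0f} m^3")
```

That works out to roughly 13.9 million m³ a year, or about 34,700 m³ per tank, far more than the roughly 2,500 m³ of an Olympic-size pool, so the initial test bag would be much smaller than a full flex tank.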
The total volume of water on Earth is about 1,400 million km³, of which only 2.5%, or about 35 million km³, is freshwater. Most freshwater occurs in the form of permanent ice or snow, locked up in Antarctica and Greenland, or in deep groundwater aquifers. The principal sources of water for human use are lakes, rivers, soil moisture and relatively shallow groundwater basins. The usable portion of these sources is only about 200,000 km³ of water worldwide.
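Those proportions are easy to verify, treating the quoted round numbers as exact:

```python
total_km3 = 1_400e6               # total water on Earth, km3
fresh_km3 = total_km3 * 0.025     # the 2.5% that is freshwater
usable_km3 = 200_000              # usable portion quoted above

print(f"freshwater: {fresh_km3:,.0f} km3")  # 35,000,000 km3, as stated
print(f"usable share of all water: {usable_km3 / total_km3:.4%}")  # ~0.0143%
```

In other words, the water actually available for human use is barely a hundredth of a percent of the planet's total.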
Freshwater resources are unevenly distributed, with much of the water located far from human populations. Many of the world's largest river basins run through thinly populated regions. At the continental level, the Americas have the largest share of the world’s total freshwater resources with 45%, followed by Asia with 28%, Europe with 15.5% and Africa with 9%.
Fresh water is necessary to sustain life, but it is equally vital for food production. Seventy percent of the world's fresh water resource is currently required for food production alone, yet water is also essential for industry. Every product on the planet has been produced using water at some stage of the process. Thirty-three countries depend on other countries for over 50% of their renewable water resources: Argentina, Azerbaijan, Bahrain, Bangladesh, Benin, Bolivia, Botswana, Cambodia, Chad, Congo, Djibouti, Egypt, Eritrea, Gambia, Iraq, Israel, Kuwait, Latvia, Mauritania, Mozambique, Namibia, Netherlands, Niger, Pakistan, Paraguay, Portugal, Republic of Moldova, Romania, Senegal, Somalia, Sudan, Syrian Arab Republic, Turkmenistan, Ukraine, Uruguay, Uzbekistan, Viet Nam and Yugoslavia.
Monday, November 21, 2016
We Trash Food While Americans Go Hungry
With Thanksgiving just around the corner, we should talk about wasted food in America. The U.S. Environmental Protection Agency (EPA) tells us that more food is sent to landfills and incinerators than any other single material in the United States. The EPA estimates that more than 35 million tons of prepared or consumer-bought food is wasted each year. The U.S. Department of Agriculture (USDA) estimates that throughout the food chain, between 30% and 40% of the total food supply, or about 133 billion pounds of food worth almost $162 billion, is wasted from farm to consumer.
The total amount of food wasted in the United States is shocking, and we have to do something about it. This wasted food is particularly disturbing when you consider that in 2015, 13% of households (15.8 million) were food insecure. That means that in the United States 42.2 million Americans lived in “food insecure” households. The U.S. Department of Agriculture defines food insecurity as not having consistent access to adequate food throughout the year, which is usually caused by poverty. People who are food insecure are simply hungry, or at risk of hunger. In the United States people go hungry every day. There are hungry people in every state and community in America; our community is no exception.
Keeping food in our communities and out of landfills helps reduce hunger, and reducing food waste also potentially reduces methane emissions from our landfills. Food waste quickly generates methane in landfills, and 20% of total U.S. methane emissions come from landfills. In addition, the land, water, labor and energy used in producing, processing, transporting, preparing, storing, and disposing of the discarded food are wasted as we throw away the imperfect and the excess.
In 2013 the USDA and EPA first called on organizations across the food chain – farms, agricultural processors, food manufacturers, grocery stores, restaurants, universities, schools, and local governments – to join efforts to:
- Reduce food waste by improving product development, storage, shopping/ordering, marketing, labeling, and cooking methods.
- Recover food waste by connecting potential food donors to food banks and pantries.
- Recycle food waste to feed animals or to create compost, bioenergy and natural fertilizers.
Then in 2015 the USDA and EPA announced the first U.S. food loss and waste reduction goal. Last week the USDA and EPA announced the inaugural group of U.S. “Food Loss and Waste 2030 Champions,” businesses and organizations that have taken up the challenge and pledged to reduce food loss and waste in their operations 50% by 2030. The Champions announced last week were: Ahold USA, Blue Apron, Bon Appétit Management Company, Campbell Soup Company, Conagra Brands, Delhaize America, General Mills, Kellogg Company, PepsiCo, Sodexo, Unilever, Walmart, Wegmans Food Markets, Weis Markets and YUM! Brands.
By joining the U.S. Food Waste Challenge, organizations and businesses demonstrate their commitment to reducing food waste, helping to feed the hungry in their communities, and reducing the environmental impact of wasted food. The Challenge Partners’ inventory of activities will help disseminate information about the best practices to reduce, recover, and recycle food waste and stimulate the development of more of these practices that can be applied to businesses in the future.
It is important to remember that cutting food waste will require a sustained commitment from everyone. The USDA estimates that about 90 billion pounds of food waste comes from consumers, at a cost of about $370 per person per year. USDA’s “Let’s Talk Trash” focuses on consumer education, highlighting key data and action steps consumers can take to reduce food waste. Take a look at this link to see the suggestions. This is much harder because it involves millions of households changing their behavior: better managing their food shopping, storage and meal planning, and using and eating leftovers. Millions of our households need to continually practice frugality in our food use. This in a nation that in 2014 produced about 258 million tons of municipal solid waste, of which only slightly over one third was recycled or composted. We have been promoting recycling since 1965 with the introduction of the recycling symbol, yet we still have a long way to go.
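To put the consumer-level numbers on a per-person basis, here is a rough calculation. The 2016 U.S. population figure is my assumption, not a USDA number:

```python
consumer_waste_lb = 90e9   # USDA estimate of consumer-level food waste, lb/year
cost_per_person = 370      # USDA estimate, dollars per person per year
us_population = 320e6      # assumed 2016 US population (not an article figure)

lb_per_person = consumer_waste_lb / us_population
implied_cost_per_lb = cost_per_person / lb_per_person

print(f"{lb_per_person:.0f} lb wasted per person per year")     # ~281 lb
print(f"implied value of about ${implied_cost_per_lb:.2f}/lb")  # ~$1.32
```

Under these assumptions, each of us throws away well over a pound of food a day, at a bit over a dollar a pound.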
Thursday, November 17, 2016
Lorton Quarry to Become Reservoir
At the end of October William Duke, President of Vulcan Materials Mideast Division, and Philip Allin, Chairman of Fairfax Water signed an agreement at a ceremony at the Griffith Water Treatment Plant in Lorton that sets the conditions for the transformation of a rock quarry into a water storage reservoir in southeastern Fairfax County.
from Google Maps
The quarry will be converted to a reservoir in phases and will continue to operate during Phase I, which will convert a portion of the quarry to a reservoir with storage of about 1.8 billion gallons by 2035. Quarry operations will end with Phase II, which will convert the remaining area to a Fairfax Water reservoir with a storage capacity of up to 15 billion gallons by 2085. To do this, the existing quarry will be reconfigured to mine portions of Fairfax Water’s property, allowing Vulcan to leave a “rock wall” that will segregate the quarry into two parts. The two-reservoir quarry reconfiguration addresses the water supply need projected to occur in the 2035-2040 timeframe.
from Fairfax Water
The Vulcan Quarry was identified as the favored alternative for meeting future water needs in the Northern Virginia Regional Water Supply Plan in 2011 and adopted by Fairfax County in early 2012. This new reservoir will be used to supplement water supply to accommodate population growth in Northern Virginia and ensure that Fairfax Water can continue to provide reliable, high-quality drinking water well into the future.
Fairfax Water projects water needs based on the most recent population and employee projections available from the Metropolitan Washington Council of Governments. Today, Fairfax Water serves nearly 2 million residents and more than 800,000 employees in Northern Virginia. Between 2010 and 2040, the population served by Fairfax Water, including wholesale customers (other communities that buy their water from Fairfax Water), is projected to increase by over 650,000 residents and nearly 550,000 employees. Fairfax Water needs to plan to reliably provide water to all of them.
All the regional water supply companies share the water resources of the Potomac. Fairfax Water, the Washington Aqueduct (WA) of the U.S. Army Corps of Engineers, the Washington Suburban Sanitary Commission (WSSC), and the Interstate Commission on the Potomac River Basin (ICPRB) signed the Water Supply Coordination Agreement, which established a framework for water supply planning, drought management, and resource optimization on the Potomac River back in 1982, and they have worked together to manage the regional water resources since.
Every five years, the ICPRB conducts a study of projected demand and available water supply resources based on the best available information at the time. The study uses water use and demographic data along with assumptions regarding changes in water use patterns in the region, and these assumptions are not certain. The ICPRB 2015 report assumes daily per capita water use will decrease by an additional 25%, incorporates various climate and weather scenarios, and uses the population growth projection provided by the Metropolitan Washington Council of Governments, which forecasts that the residential population will grow by 23% and the workforce by 36% by 2040. The report also looked carefully at the impact that climate change might have on water supply.
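If both of those assumptions hold, the net effect on residential demand is a simple product: a 23% larger population each using 25% less water. A one-line check:

```python
pop_growth = 0.23          # forecast residential population growth by 2040
per_capita_change = -0.25  # assumed additional drop in daily per-capita use

net = (1 + pop_growth) * (1 + per_capita_change) - 1
print(f"net change in residential demand: {net:+.2%}")  # about -7.75%
```

Of course, the per-capita decline is exactly the kind of assumption the report itself flags as uncertain.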
Historically, a key assumption was that the future flow of the Potomac River will mirror the hydrologic conditions of the past 79 years. If hydrologic conditions are changing, or if a 79-year period is inadequate to predict the possible extent of droughts, this could affect the availability of water. So, a couple of years ago the ICPRB commissioned a study that created a model of water supply availability from the Potomac watershed under various climate scenarios, to determine whether the water supply would be adequate to serve the population. They used this model to examine water supply adequacy in the current study.
The ICPRB found that the existing water supplies can meet demands of the forecasted population levels through 2035 by implementing mandatory water restrictions during severe droughts. However, as the population and water demand continue to grow, the current supply system, including the Potomac River and all current and planned reservoirs and water storage, would not be adequate to supply all needs during a severe drought, even after using all the reservoirs to supplement flow and implementing water use restrictions.
This is why Fairfax Water has worked with Vulcan to develop the “two-reservoir quarry reconfiguration,” to provide interim water supply storage in 2035 as well as a significantly larger storage facility beyond 2085. With the delivery of the Northern Reservoir in 2035, Fairfax Water will be able to expand the Griffith Plant to 160 million gallons a day. The Northern Reservoir will also provide an emergency source of supply to the Griffith water treatment plant when emergencies like chemical spills restrict the larger and newer Corbalis plant’s access to the Potomac River, as happened last year when Fairfax Water had to shut its intake to let a plume of contamination pass.
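A rough way to size those reservoirs is to divide storage by the planned Griffith capacity. This assumes the reservoir is drawn down at the full 160 million gallons a day with no other inflow, which is a simplification:

```python
phase1_gal = 1.8e9     # Phase I (Northern Reservoir) storage, gallons
full_gal = 15e9        # full build-out storage by 2085, gallons
griffith_gpd = 160e6   # planned Griffith plant capacity, gallons/day

phase1_days = phase1_gal / griffith_gpd
full_days = full_gal / griffith_gpd
print(f"Phase I: ~{phase1_days:.0f} days of full Griffith output")  # ~11 days
print(f"Full build-out: ~{full_days:.0f} days")                     # ~94 days
```

Eleven days of interim storage covers short emergencies like a contamination plume; the roughly three months of storage at full build-out is what matters in a prolonged drought.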
Monday, November 14, 2016
Compare Air Pollution in New Delhi to Your Home
from US Embassy Feed
Last week the New Delhi smog was reported to be its worst in 17 years. The government closed all of the city's more than 5,000 schools for three days to minimize the risk to children. In a city of over 17 million, that meant an estimated 4.41 million children missed three days of school, according to the United Nations Children's Fund. The PM2.5 monitoring station atop the US Embassy in New Delhi reported that the Air Quality Index had fallen first to 400 and then to 258 over the weekend, but the air quality is forecast to worsen again later this week.
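For readers curious how a PM2.5 concentration becomes an AQI number like 400 or 258: the US EPA maps the 24-hour average concentration onto the index by linear interpolation within fixed breakpoint bands. A sketch using the PM2.5 breakpoint table in effect in 2016 (treat this as illustrative, not an official calculator):

```python
# EPA AQI = (I_hi - I_lo) / (C_hi - C_lo) * (C - C_lo) + I_lo within each band.
# Bands: (C_lo, C_hi in ug/m3, I_lo, I_hi).
PM25_BREAKPOINTS = [
    (0.0, 12.0, 0, 50), (12.1, 35.4, 51, 100), (35.5, 55.4, 101, 150),
    (55.5, 150.4, 151, 200), (150.5, 250.4, 201, 300),
    (250.5, 350.4, 301, 400), (350.5, 500.4, 401, 500),
]

def pm25_aqi(conc: float) -> int:
    """AQI for a 24-hour average PM2.5 concentration in ug/m3."""
    for c_lo, c_hi, i_lo, i_hi in PM25_BREAKPOINTS:
        if c_lo <= conc <= c_hi:
            return round((i_hi - i_lo) / (c_hi - c_lo) * (conc - c_lo) + i_lo)
    raise ValueError("concentration beyond the AQI scale")

print(pm25_aqi(10.0))   # 42 -- the "Good" range typical of rural Virginia
print(pm25_aqi(210.0))  # 260 -- "Very Unhealthy", near New Delhi's weekend reading
```

An AQI above 300 is classed as “Hazardous,” which is where New Delhi sat at its reported peak of 400.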
PM2.5 particles are a major contributing factor to lung disease. A study of children in Southern California showed lung damage associated with long-term particulate exposure, and a multi-city study found decreased lung function in children associated with long-term particulate exposure. United States particulate levels are a small fraction of the levels in the worst areas of the world: Beijing, New Delhi, Santiago (Chile), Mexico City, Ulaanbaatar (Mongolia), Cairo (Egypt), Chongqing (China), Guangzhou (China), Hong Kong, and Kabul (Afghanistan).
PM2.5 particles can be either directly emitted or formed via atmospheric reactions. Primary particles are emitted from cars, trucks, and heavy equipment, as well as residential wood combustion, forest fires, and agricultural waste burning, which is still common in India. The main components of secondary particulate pollution are formed when pollutants like NOx and SO2 react in the atmosphere to form particles; these pollutants are emitted from coal-fired power plants and other combustion sources. The increase in automobiles and coal-fired power plants has exacerbated this problem in India, China and other areas of the world, because particulates can travel great distances.
So, while India and China spew more and more pollutants and particulates, most concentrated in their own cities but increasingly felt in cities around the world, the United States continues to reduce its own emissions. If you want to take a look at real-time particulate pollution levels, you can see what the monitors nearest your home are reporting. Long Park in Haymarket, Virginia was reporting an AQI level of 2 as I was finishing this article. Long Park is about 3 miles from my house down Route 15.
Thursday, November 10, 2016
5 Second Rule is Bunk
Though it was “busted” by the MythBusters years back when they were still on the air, researchers at the Rutgers University School of Environmental and Biological Sciences (extension), Professor Donald Schaffner and his graduate student Robyn Miranda, recently put the “5 second rule” to a rigorous scientific test. The “5 second rule” is the popular notion that food dropped on the floor, but picked up quickly, is safe to eat because bacteria need time to transfer.
Bacterial cross-contamination from food coming into contact with surfaces can contribute to foodborne disease. The cross-contamination rate of Enterobacter aerogenes was used as a proxy for disease-carrying bacteria and measured on household surfaces of stainless steel, tile, wood and carpet. The food types were watermelon, bread, bread with butter and gummy candy. The transfer times tested were under 1 second, 5 seconds, 30 seconds and 300 seconds. Transfer scenarios were evaluated for each surface type, food type, contact time and bacterial prep; surfaces were inoculated with bacteria and allowed to completely dry before food samples were dropped and left to remain for the specified periods.
What the researchers found was that bacteria can transfer essentially immediately on contact with food that is dropped, and the wetter the food, the higher the risk of transfer: watermelon had the most contamination, gummy candy the least. Also, longer food contact times usually resulted in the transfer of more bacteria from each surface to food.
All told, 128 scenarios were replicated 20 times each, yielding 2,560 measurements. Post-transfer surface and food samples were analyzed for contamination. The researchers concluded that the longer food was in contact with a surface, the more bacteria transferred; they also found that other factors, such as the nature of the food and the surface it falls on, are of equal or greater importance. Surprisingly, they found that carpet has very low transfer rates compared with those of tile and stainless steel, while the transfer rate from wood is more variable. “The topography of the surface and food seem to play an important role in bacterial transfer,” Dr. Schaffner said. “Bacteria don’t have legs, they move with the moisture, and the wetter the food, the higher the risk of transfer. Also, longer food contact times usually result in the transfer of more bacteria from each surface to food.”
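The 128-scenario count is simply the full factorial of the experiment's factors. Reconstructing it (the existence of exactly two bacterial preparations is inferred from the arithmetic; the article does not name them):

```python
from itertools import product

surfaces = ["stainless steel", "tile", "wood", "carpet"]
foods = ["watermelon", "bread", "bread with butter", "gummy candy"]
contact_seconds = [1, 5, 30, 300]   # "<1 s" treated as 1 for labeling
preps = ["prep 1", "prep 2"]        # two bacterial preparations (assumed)

scenarios = list(product(surfaces, foods, contact_seconds, preps))
print(len(scenarios))        # 128 scenarios (4 x 4 x 4 x 2)
print(len(scenarios) * 20)   # 2560 measurements at 20 replicates each
```

A full factorial design like this is what lets the authors separate the effect of contact time from the effects of food type and surface.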
The bottom line is that contamination can transfer almost immediately. It is essential to clean your work surfaces and floors regularly to prevent cross-contamination of food on your counters, and don’t eat whatever it was that fell on the floor; throw it out. Donald Schaffner is a professor and extension specialist in food science at the School of Environmental and Biological Sciences, Rutgers University-New Brunswick. Robyn Miranda is a graduate student in his laboratory there. Their study appears online in the American Society for Microbiology’s journal, Applied and Environmental Microbiology (only the abstract is free).
Monday, November 7, 2016
You Can View High Resolution Land Use Data
For the last several years the Chesapeake Bay Program Office of the U.S. Environmental Protection Agency (EPA) has worked with local governments and their partners in all 206 counties within the Chesapeake Bay watershed, across six states and the District of Columbia. All the Chesapeake Bay watershed counties and major municipalities have gathered information on local land cover, land use, parcel and zoning data and converted it to a consistent format so that it can be accessed and used by the EPA.
Thanks to the hard work of groups like our own Prince William Soil and Water Conservation District and the Prince William County Government, local land use data was collected from over 80% of counties. In parallel with these activities, the counties and municipalities funded the development of new high-resolution data on land cover—such as impervious surfaces, tree cover and water—for the entire watershed. This work, carried out by the Chesapeake Conservancy, the University of Vermont and World View Solutions, mapped land cover across more than 80,000 square miles at one-square-meter resolution. This is amazing resolution. This land cover data was then combined with the information provided by the various local governments and agencies to produce a detailed land use dataset for each county.
Now the EPA has used the high-resolution mapping of land use to update and improve the Chesapeake Bay Program’s Chesapeake Bay Watershed Model, which is used to measure success against the EPA-mandated Chesapeake Bay restoration activities and to support local, state and regional decision making across the region. The latest version of this model, Phase 6, is currently under review. The EPA mandated nutrient and sediment contamination limits for all the states in the Chesapeake Bay watershed and Washington DC. The EPA set a total limit for the entire watershed of 185.9 million pounds of nitrogen, 12.5 million pounds of phosphorus and 6.45 billion pounds of sediment per year, which represented a 25% reduction in nitrogen, 24% reduction in phosphorus and 20% reduction in sediment from the 2011 levels. The pollution limits were then partitioned to the various states, river basins and counties based on the Chesapeake Bay computer model and monitoring data. The problem with the first versions of the model was that the land use and impervious ground cover data were not consistent across different parts of the model.
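Working backward from the limits and the quoted reduction percentages gives the implied 2011 baseline loads. This is a simple inversion of the figures above, nothing more:

```python
# (annual limit, fractional reduction from 2011) for each pollutant
limits = {
    "nitrogen (million lb)": (185.9, 0.25),
    "phosphorus (million lb)": (12.5, 0.24),
    "sediment (billion lb)": (6.45, 0.20),
}

baselines = {name: limit / (1 - cut) for name, (limit, cut) in limits.items()}
for name, baseline in baselines.items():
    print(f"{name}: implied 2011 load of about {baseline:.1f}")
```

That works out to roughly 247.9 million pounds of nitrogen, 16.4 million pounds of phosphorus and 8.1 billion pounds of sediment per year as the 2011 starting point.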
The pollution limits were created by a series of models of the Chesapeake Bay watershed. These computer models are mathematical representations of the real world that estimate environmental events and conditions. The models are at best imperfect, but they are nonetheless the best tool available for viewing the 80,000 square miles of the watershed. The Chesapeake Bay and its watershed are so large and complex that scientists and regulators rely on computer models for critical information about the ecosystem’s characteristics and health, then use the models to assess the impact of various environmental mitigations to reduce pollution.
The earlier versions of the model had used approximately 675,917 acres for the impervious surface data and 1,885,915 acres for the pervious surface data in Virginia. A review of the EPA’s own data found that there were 1,569,377 impervious acres and 3,442,346 pervious acres in the urban areas of the Virginia segments of the model, which include all the paved and landscaped areas of suburbia. Between the 1990 census and the 2010 census, when the model was developed, the population of Virginia grew from 6.2 million people to 8.0 million people, with the bulk of that growth in the urban and suburban centers of the Chesapeake Bay watershed. This distorted the model and resulted in the underreporting of pollution from impervious surfaces (like roads, buildings and parking lots). Pollution loads for nitrogen, phosphorus and sediment in the urban areas are calculated using a constant pounds/acre/year for impervious acres as a fixed input, and the pervious load is based on total fertilizer sales data less the impervious load. The underreported acres of housing and roadways inflated the apparent agricultural contribution to the pollution. Hopefully, that has now been corrected in the latest version of the model.
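Since the pervious load is computed as a residual, undercounting impervious acres directly understates the urban load. A toy sketch of that accounting (the per-acre rate and fertilizer-based total are invented for illustration; only the acreage figures come from the article):

```python
RATE_LB_PER_ACRE = 10.0   # hypothetical constant pounds/acre/year
fertilizer_total = 40e6   # hypothetical total load inferred from fertilizer sales

def urban_split(impervious_acres):
    """Return (impervious load, pervious load) under the model's accounting."""
    impervious_load = RATE_LB_PER_ACRE * impervious_acres
    return impervious_load, fertilizer_total - impervious_load

old_imp, old_perv = urban_split(675_917)     # acreage in earlier model versions
new_imp, new_perv = urban_split(1_569_377)   # acreage per EPA's own data
print(f"impervious load understated by {new_imp - old_imp:,.0f} lb/year")
```

Even with made-up rates, the structure shows the effect: every impervious acre missed shifts load out of the urban column, and in the overall accounting that makes agriculture's share look larger than it is.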
The datasets will be made available free of charge to local governments and the public over the next month or so. In addition, local governments will make available the data they collected on past land cover and land use over the last 30 years, as well as map overlays with geographic coverages of federal lands, sewer service areas, regulated stormwater areas and combined sewer overflow areas within each county. At the moment, Prince William and the rest of the Virginia counties are all still pending, but they should be ready for viewing shortly at http://chesapeake.usgs.gov/phase6/; the beta version of the Chesapeake Bay Model can also be viewed at that link.
Thanks to the hard work of groups like our own Prince William Soil and Water Conservation District and the Prince William County Government local land use data was collected from over 80% of counties. In parallel with these activities, the counties and municipalities funded the development of new high-resolution data on land cover—such as impervious surfaces, tree cover and water—for the entire watershed. This work was carried out by the Chesapeake Conservancy, the University of Vermont and World View Solutions, mapped out land cover across more than 80,000 square miles at a one-square-meter resolution. This is amazing resolution. This land cover data was then combined with the information provided by the various local governments and agencies to produce a detailed land use dataset for each county.
Now the EPA has used the high-resolution mapping of land use to update and improve the EPA Chesapeake Bay Program’s Chesapeake Bay Watershed Model, used to measure success against the EPA mandated Chesapeake Bay restoration activities and support local, state and regional decision making across the region. The latest version of this model, Phase 6, is currently under review. The EPA mandated a contamination limit for nutrient contamination and sediment to all the states in the Chesapeake Bay Watershed and Washington DC. The EPA set a total limit for the entire watershed of 185.9 million pounds of nitrogen, 12.5 million pounds of phosphorus and 6.45 billion pounds of sediment per year which was a 25% reduction in nitrogen, 24% reduction in phosphorus and 20 % reduction in sediment from the 2011 levels. The pollution limits were then partitioned to the various states and river basins and counties based on the Chesapeake Bay computer model and monitoring data. The problem with the first versions of the model was the land use data and impervious ground cover data was not consistent across different parts of the model.
The pollution limits were created by a series of models of the Chesapeake Bay Watershed. These computer models are mathematical representations of the real world that estimate environmental events and conditions. The models are imperfect at best, but they are nonetheless the best tool available to view the 80,000 square miles of the watershed. The Chesapeake Bay and its watershed are so large and complex that scientists and regulators rely on computer models for critical information about the ecosystem’s characteristics and health, then use the models to assess the impact of various environmental mitigations to reduce pollution.
The earlier versions of the model had used approximately 675,917 acres for the impervious surface data and 1,885,915 acres for the pervious surface data in Virginia. A review of the EPA’s own data found that there were 1,569,377 impervious acres and 3,442,346 pervious acres in the urban areas of the Virginia segments of the model, which include all the paved and landscaped areas of suburbia. Between the 1990 census and the 2010 census, when the model was developed, the population of Virginia grew from 6.2 million people to 8.0 million people. The bulk of that growth took place in the urban and suburban centers of the Chesapeake Bay watershed. This served to distort the model and resulted in the under-reporting of pollution from impervious surfaces (like roads, buildings and parking lots). Pollution loads for nitrogen, phosphorus and sediment in the urban areas are calculated using a constant pounds/acre/year for impervious acres as a fixed input, and the pervious load is based on total fertilizer sales data less the impervious load. The under-reported acres of housing and roadways distorted the agricultural contribution to the pollution by increasing it. Hopefully, in the latest version of the model, that has now been corrected.
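The allocation described above can be sketched in a few lines. This is only an illustration of why the impervious acreage matters: the loading rate and fertilizer-sales total below are hypothetical placeholders, and only the acreage figures come from the model review.

```python
# Hypothetical inputs -- only the acreage figures are from the EPA review.
RATE_LB_PER_ACRE = 10.0           # placeholder impervious loading rate (lb/acre/yr)
FERTILIZER_SALES_LB = 40_000_000  # placeholder statewide fertilizer-sales total

def urban_loads(impervious_acres, rate=RATE_LB_PER_ACRE,
                fertilizer_sales=FERTILIZER_SALES_LB):
    """Impervious load is a fixed rate times acres; the pervious load is
    whatever remains of the fertilizer-sales total after subtracting it."""
    impervious = impervious_acres * rate
    pervious = fertilizer_sales - impervious
    return impervious, pervious

old = urban_loads(675_917)    # acreage used in earlier model versions
new = urban_loads(1_569_377)  # acreage found in the EPA's own data

# Under-counting impervious acres understates the impervious load and
# pushes the difference onto pervious (and ultimately agricultural) sources.
print(old, new)
```

Because the pervious load is computed as a remainder, every acre missing from the impervious count silently reassigns its pollution to other source categories, which is how the model inflated the agricultural contribution.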
Thursday, November 3, 2016
WSSC Tries New Pipes
The Washington Suburban Sanitary Commission (WSSC) provides water and sewer service to 1.8 million residents in approximately 460,000 households and businesses in Prince George’s and Montgomery counties in Maryland. Established in 1918, WSSC is one of the largest water and wastewater utilities in the nation, with a network of about 5,600 miles of fresh water pipeline and over 5,400 miles of sewer pipeline. Unfortunately, over many years the maintenance and replacement of the piping systems was deferred.
Approximately 1,300 miles of the more than 5,600 miles of water mains are more than 50 years old. Nearly 2,500 miles of water mains are between 25 and 50 years old. Almost 1,800 miles of water main were installed in the last 25 years. The age of the piping reflects the buildout of the system more than any maintenance and replacement program; for decades the only pipes replaced were those that had already failed. After decades of deferred maintenance, water main breaks have grown in frequency to about 2,000 breaks a year. Though age is not the only factor that causes pipe failure, most of the system’s pipes were designed for an average lifespan of 70 years. Over the next 10 years WSSC projects they will have to replace over 2,000 miles of water pipe and a similar amount of sewer pipe. WSSC estimates that water pipes cost about $1,600,000 per mile of pipe.
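Putting the figures above together gives a back-of-envelope cost for the projected 10-year water pipe replacement program (water mains only, before any sewer work):

```python
# Back-of-envelope cost of the projected replacement program,
# using the figures quoted above.
miles_to_replace = 2_000       # projected over the next 10 years
cost_per_mile = 1_600_000      # dollars per mile of water pipe

total_cost = miles_to_replace * cost_per_mile
print(f"${total_cost:,}")  # → $3,200,000,000
```

That is $3.2 billion over a decade for water mains alone, which is why the pace of replacement has lagged so far behind the aging of the system.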
Now WSSC is introducing new pipes that feature a zinc coating on the exterior, which protects against corrosion. This is an improvement on the previous industry standard of uncoated ductile iron pipes that are exposed to soil. The zinc coating increases the life of iron pipes: charged zinc ions migrate naturally to scratched areas, so the pipes are less susceptible to corrosion damage than those composed of other materials. Additionally, the new piping will have a V-bio® enhanced polyethylene encasement wrap, which discourages the growth of harmful bacteria that can damage the pipe.
Traditional ductile iron pipe lasts about 50-75 years; the new pipes have a projected lifespan of well over 100 years, based on experience in Europe. Zinc coating on ductile iron pipe has been in widespread use in Europe, where the industry first began using zinc coatings in 1955. As a result of zinc's widespread use there, standards were both developed and widely adopted. The advances in zinc coatings over the past 60 years have resulted in a highly effective corrosion-inhibiting pipe that is now also wrapped in V-bio polywrap.
The wrap protects the iron from a fresh supply of oxygen, thus halting or greatly inhibiting corrosion. The patented V-Bio prevents microbiological cells from forming that would attack the iron and deplete the zinc. It is believed that the use of V-Bio, with zinc present as an anode to the iron, will further slow the process, though field tests in the United States have so far run only about 10 years. The new pipe costs slightly more than the older style pipes, but since installation is the bulk of the cost, the total price per mile is estimated to increase only to $1,620,000 from $1,600,000, a reasonable cost for what WSSC hopes will be decades of additional service.
Though WSSC has recently been replacing about 55-60 miles of water mains per year, that has not been enough to keep up with an aging system that suffers from decades of deferred maintenance and some problematic piping. At that rate of replacement it would take 101 years to replace the entire water system, yet much of the system has already exceeded its design life and some pipes in the WSSC system have not been lasting as long as originally projected.
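The 101-year figure follows directly from the system size and the recent replacement pace; the per-year rate below is the implied value consistent with the article's numbers, not an official WSSC statistic.

```python
# How long a full replacement cycle takes at the recent pace.
system_miles = 5_600
miles_replaced_per_year = 55.4  # implied rate within the quoted 55-60 mile range

years_to_full_replacement = system_miles / miles_replaced_per_year
print(round(years_to_full_replacement))  # → 101
```

Against a 70-year design lifespan, a 101-year replacement cycle guarantees that a growing share of pipe will be operating past its design life at any given time.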
Most of the WSSC system was installed after World War II, during the building booms of the late 20th and early 21st centuries. Post-World War II pipes tend to have an average real-world life of 50-105 years depending on many factors (AWWA). To extend the life of the ductile pipes they were mortar-lined; these linings were meant to prevent corrosion and increase pipe longevity. In the 1970s, steel-reinforced concrete pipe with a promised life of 100 years began to be used by WSSC for the giant water mains. Unfortunately, these concrete trunk lines began to fail catastrophically decades before their promised 100-year life expectancy.
WSSC has 350 miles of steel-reinforced concrete pipe. WSSC’s supplier, Interpace, may have produced inferior pipe; the company was successfully sued by WSSC and others and is now out of business. Nine of the WSSC’s concrete mains have blown apart since 1996. After a particularly spectacular blowout in 2008, and to prevent future catastrophe, WSSC installed a sensor system along all the concrete mains, at a cost of more than $21 million, to alert WSSC of an impending failure; unfortunately, the replacement program became an emergency replacement program responding to sensors and smaller breaks. The new program hopes to improve this situation, but with 1,300 miles of piping over 50 years old it is likely that the number of water main breaks will get worse before it gets better. Remember, most pipes break in the winter months, so be prepared for emergencies and store an adequate emergency supply of water in your home; 10 gallons per person should be a three-day supply.
Monday, October 31, 2016
Wind Generated Power 2016
Since the last time I looked at the share of total U.S. electricity generation from wind, it has risen significantly. Wind facilities produced 190,927 gigawatthours (GWh) of electricity in 2015, or 4.7% of net U.S. electric power generation. Wind's share of generation has doubled since 2010, when it was 2.3%. Based on monthly data through July, wind has provided 5.6% of U.S. generation in 2016.
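The 4.7% share implies a total U.S. net generation figure, which can be recovered from the two numbers above:

```python
# Recovering total U.S. net generation from wind output and wind's share.
wind_gwh = 190_927
wind_share = 0.047  # 4.7% of net U.S. generation in 2015

total_generation_gwh = wind_gwh / wind_share
print(f"{total_generation_gwh:,.0f} GWh")  # roughly 4.06 million GWh
```

That works out to roughly 4.06 million GWh of total net generation, which puts wind's 190,927 GWh in perspective.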
The increase of wind power in the United States has been driven by a combination of technology and policy changes. Technological changes include improved wind turbine technology and increased access to transmission capacity. Financial incentives and policies such as the federal Production Tax Credit (PTC), Investment Tax Credit (ITC), and state-level renewable portfolio standards (RPS) have pushed utilities and investment groups to build more wind capacity. The PTC grants a federal tax credit on wind generation, while the ITC allows federal tax credits on wind farm investments. State RPS, meanwhile, require that a minimum percentage of electricity generation come from renewable energy.
The bulk electric system of the Lower 48 states consists of three independent electric interconnections: Eastern, Western, and the ERCOT portion of Texas. Because of minimal transfers of electricity between the interconnections, each interconnection essentially meets its demand with its own generating resources. Despite the incentives, most wind generation takes place in the center of the nation, where prevailing winds are strongest.
In 2015, 11 states generated at least 10% of their total electricity from wind. In 2010, only three states had at least a 10% wind share. Iowa had the largest wind generation share, at 31.3%, and South Dakota (25.5%) and Kansas (23.9%) had wind generation shares higher than 20%. Two additional states, Texas and New Mexico, are on track to surpass a 10% wind generation share in 2016, based on data through July. Wind generation in Texas, the highest wind electricity-producing state, made up 24% of the national total wind generation and 9.9% of Texas's total electricity generation in 2015.