Sunday, August 29, 2021
Particulate Pollution Tied to Dementia
Using data from two large, long-running study projects in the Puget Sound region of Washington State (one that began measuring air pollution in the late 1970s and another, begun in 1994, on risk factors for dementia), University of Washington researchers identified a link between air pollution and dementia.
In the University of Washington study, lead author Rachel Shaffer, whose PhD dissertation formed the basis of the work, found that a small increase in the levels of fine particle pollution (PM2.5, or particulate matter 2.5 micrometers or smaller) averaged over a decade at specific addresses in the Seattle area was associated with a greater risk of dementia for people living at those addresses.
“We found that an increase of 1 microgram per cubic meter of exposure corresponded to a 16% greater hazard of all-cause dementia. There was a similar association for Alzheimer’s-type dementia,” said Dr. Shaffer, who conducted the research as a doctoral student in the UW Department of Environmental & Occupational Health Sciences.
Rachel Shaffer et al. used the Adult Changes in Thought (ACT) cohort study based in Seattle to examine associations between exposure to fine particulate matter with a diameter ≤2.5 μm (PM2.5) and incident all-cause dementia. Once a participant with dementia was identified, the researchers compared the average pollution exposure of each participant leading up to the age at which that patient was diagnosed. Exposure data were based on the home addresses of the individuals, not on their activity levels or locations. The researchers also had to account for the different years in which these individuals were enrolled in the study, since air pollution has dropped significantly since the 1970s, when the air pollution monitoring used in the study began.
In their final analysis, the researchers found that just a 1 microgram per cubic meter difference in average PM2.5 between residences was associated with a 16% higher incidence of dementia. A difference of one microgram per cubic meter of PM2.5 is extremely small and subtle.
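For readers who want to see how a hazard ratio of this size scales with larger exposure contrasts, here is a minimal arithmetic sketch. The 1.16 (16%) figure is from the study; the exposure differences and the code itself are only illustrative, assuming the usual proportional-hazards convention that per-unit ratios compound multiplicatively.

```python
# Illustrative arithmetic only: scaling the reported per-unit hazard ratio to
# other exposure differences. The 1.16 (16%) figure is from the study; the
# exposure differences below are hypothetical examples.
hr_per_ug = 1.16  # hazard ratio per 1 microgram/m^3 of long-term average PM2.5

for delta in (0.5, 1.0, 2.0, 5.0):  # hypothetical differences in micrograms/m^3
    hr = hr_per_ug ** delta         # proportional-hazards ratios compound multiplicatively
    print(f"{delta:>3} ug/m^3 difference -> roughly {(hr - 1) * 100:.0f}% higher hazard")
```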
“We know dementia develops over a long period of time. It takes years —even decades — for these pathologies to develop in the brain and so we needed to look at exposures that covered that extended period,” Shaffer said. And, because of long-running efforts by many UW faculty and others to build detailed databases of air pollution in our region, “we had the ability to estimate exposures for 40 years in this region. That is unprecedented in this research area and a unique aspect of our study.”
Although this is not the first report on the subject, and the study shows only correlation, not causation, Beate Ritz and Yu Yu state in their invited commentary that the study “makes an important contribution to the field, not only due to its size and careful exposure and outcomes assessment, but especially because it suggests that the cognitive health of even a low-risk, low-exposure population may be affected.”
As the researchers state: “Air pollution exposure is ubiquitous globally. Strong evidence links air pollutants to cardiovascular events and diabetes, both known to affect cognition in elders. However, data in support of the contributions of air pollution to aging-related cognitive decline are only just emerging (Paul et al. 2019). As exposures are chronic and affect large populations, even modest risks result in large numbers of cases (Kuenzli 2002).” So, as the particulate pollution from western wildfires moves east, mask up.
Rachel M. Shaffer, Magali N. Blanco, Ge Li, Sara D. Adar, Marco Carone, Adam A. Szpiro, Joel D. Kaufman, Timothy V. Larson, Eric B. Larson, Paul K. Crane, and Lianne Sheppard; Fine Particulate Matter and Dementia Incidence in the Adult Changes in Thought Study. Environmental Health Perspectives 129(8), CID: 087001. Published 4 August 2021. https://doi.org/10.1289/EHP9018
Wednesday, August 25, 2021
Hurricane Frequency Is Not Increasing
I was glued to the weather channel this past weekend to watch Hurricane Henri head for New England, but was relieved when Henri was downgraded to a tropical storm. I have read and heard that climate change is responsible for increasing hurricane intensity and frequency. Satellite data from the last 40-45 years is the source of that belief. Over the weekend I had plenty of time to read up on the latest research. What I found is that, according to an article published recently in Nature Communications, that may not be true.
In the article, “Changes in Atlantic major hurricane frequency since the late-19th century,” that I have cited below, Dr. Gabriel Vecchi, a climate scientist at Princeton University, and his co-authors found that the frequency of hurricanes has not increased in the Atlantic over the past 168 years. The trend in intensity was not examined, though the data bring into question whether the intensity of storms has increased over the 168-year period.
The scientists found “that recorded century-scale increases in Atlantic hurricane and major hurricane frequency, and associated decrease in USA hurricanes strike fraction, are consistent with changes in observing practices and not likely a true climate trend.” The scientists developed a method using probabilities and known storm tracks to adjust the old observation data and found no significant increase in hurricane frequency over 1851-2019. Instead, they found a decrease in hurricane frequency about 50 years ago followed by a recovery.
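The authors’ homogenization is far more sophisticated than this, but the core idea of correcting a raw storm count for incomplete historical observation can be sketched roughly as follows. This is not the paper’s method; the function and every number in it are hypothetical, for illustration only.

```python
# Rough illustration of correcting a storm count for incomplete detection.
# This is NOT the method of Vecchi et al.; all numbers are hypothetical.

def estimated_true_count(observed: int, detection_probability: float) -> float:
    """Scale an observed count up by the estimated chance a storm was detected."""
    return observed / detection_probability

# Pre-satellite era: suppose ships and coastal stations caught ~75% of storms.
print(estimated_true_count(observed=6, detection_probability=0.75))  # ~8.0 storms estimated
# Satellite era: essentially complete detection, so no adjustment.
print(estimated_true_count(observed=8, detection_probability=1.00))  # 8.0 storms
```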
Even though the North Atlantic (NA) basin is a minor contributor to global cyclone storm frequency, Atlantic hurricanes have a well-documented long-term record of their tracks and frequency, mostly because of the damage the storms bring when they make landfall as Categories 3–5.
In homogenizing the historical data from 1851–2019, the scientists discovered that the increases in basin-wide hurricane and major hurricane activity since the 1970s were not a trend caused by climate change, but a recovery from a deep minimum in the 1960s–1980s. This same recovery may have affected the conclusions of other studies of storm intensity over the 1979-2019 period.
The scientists postulated that air pollution containing particulates blocked and scattered sunlight and induced reductions in major hurricane frequency. They go on to say that this could have masked a century-scale greenhouse-gas warming contribution to North Atlantic major hurricane frequency; nonetheless, such a contribution does not appear in their data.
The accepted climate models and theoretical arguments indicate that in a warming world the hurricane peak intensity and intensification rate should increase, along with frequency. So, climate models currently project an increase in the overall number of Category 3, 4, or 5 hurricanes in response to CO2 increases.
The scientists do not believe that their work provides evidence against the hypothesis that greenhouse-gas-induced warming may lead to an intensification of North Atlantic hurricanes. Kossin et al., in their work using satellite data for the period 1979-2017, found a statistically significant upward trend in the intensity of hurricanes and tropical cyclones globally. The new study may bring that finding into question, but it is argued that substantial variability may obscure trends computed over the past century, and a pollutant-driven reduction in hurricane activity over the 1960s–1980s may have obscured any greenhouse-induced hurricane intensification over the 20th century.
Other research has supported the idea that the number of hurricanes has not increased, but there is a strong belief among climate scientists that storm intensity is increasing, though there is no firm documentation of that as yet. The data selection period and the difficulty of collecting accurate data in the past have limited the ability to see clearly whether such trends exist.
Dr. Vecchi and his research team are the go-to researchers on climate impacts on hurricanes and extreme weather events. This is one of those instances when time and further data collection will ultimately hold more answers, or at least more information. In the meantime, remember that storms can be deadly, rising sea level increases storm surge and flooding, and we should always take storm warnings seriously.
Vecchi, G.A., Landsea, C., Zhang, W. et al. Changes in Atlantic major hurricane frequency since the late-19th century. Nat Commun 12, 4054 (2021). https://doi.org/10.1038/s41467-021-24268-5
Sunday, August 22, 2021
Colorado River
From the Bureau of Reclamation:
- Lake Powell’s January 1, 2022, water elevation will be 3,535 feet - about 165 feet below full. Based on this projection, Lake Powell will release 7.48 million acre-feet in water year 2022.
- Lake Mead will operate in its Level 1 Shortage Condition for the first time ever. The required water reductions, negotiated under the interim guidelines for Lower Basin Shortages and Coordinated Operations of Lake Powell and Lake Mead, are:
- Arizona must reduce water use by 512,000 acre-feet, which is approximately 18% of the state’s annual apportionment.
- Nevada must reduce water use by 21,000 acre-feet, which is 7% of the state’s annual apportionment.
- Mexico will receive 80,000 acre-feet less, which is approximately 5% of the country’s annual allotment.
- California as the most senior water right holder in the lower basin does not have a reduction under Level 1 shortage conditions.
More than twenty years of drought, drying of the West due to a changing climate, and growing populations throughout the basin are creating a water crisis. Under the terms of the Colorado Compact, the Upper Basin States must deliver 7,500,000 acre-feet of water each year to the Lower Basin States and 1,500,000 acre-feet for Mexico; the Compact made no separate allotment for the tribes or for nature. The Lower Basin states of Arizona, California and Nevada were the first to address their growing problem by creating a Drought Contingency Plan to address California’s use of water in excess of its Lower Basin allotment.
The Upper Basin States of Colorado, New Mexico, Utah and Wyoming still have to agree to their own Drought Contingency Plan to prevent a “Compact Call” under the Colorado Compact, which would simply cut water to users across the Upper Basin States proportionally if they are unable to deliver the water they are required to send through Lake Powell under the Compact.
Today, the river provides water to 40 million people and 5.5 million acres of farmland in Colorado, Wyoming, Utah, New Mexico, Nevada, Arizona and California, to 29 Native American tribes, and to the Mexican states of Sonora and Baja California. Even without climate change, paleoclimate records show a history of tremendous droughts in the region, and now more than 40 million people in the upper and lower basins depend upon the Colorado River for their water supply.
The 1922 Colorado River Compact, negotiated by the seven basin states (Colorado, Nevada, Utah, New Mexico, Wyoming, Arizona, and California), divided the Colorado River basin into upper and lower portions and allotted the Colorado’s water on the basis of territory rather than prior appropriation. Before this agreement was negotiated, allocation of water rights (ownership) was based on historic use: the first to use the water owned it in perpetuity. In a land where water was wealth and nearly all water was diverted from its natural location, this was how it was done. The allocation of water rights based on territory allowed development to proceed in the lower basin (essentially California) while safeguarding supplies for the upper basin. Then, as now, California's growth and demand for water were viewed with concern by her neighbors.
The problem now is that the allocations promised under the Colorado Compact were based on an expectation that the river's average flow was 16.4 million acre-feet per year and ignored the needs of nature and the tribes. Subsequent studies, however, have concluded that the long-term average flow of the Colorado is less. In addition, according to the University of Arizona, records going back to paleolithic times (more than 10,000 years ago) indicate periods of mega-droughts in the distant past.
The allotted shares of water in the basin to both the United States and Mexico exceed the average long-term (1906 through 2018) historical natural flow of under 16.0 million acre-feet. To date, the imbalance has been managed, and demands largely met, by slowly using up the considerable amount of reservoir storage capacity in the Colorado River system, which once held approximately 60 million acre-feet (nearly 4 years of average natural flow of the river). It was assumed that drought years would be followed by wet years to refill the reservoirs. The basin is now in its 22nd year of drought.
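A back-of-the-envelope sketch shows how allocations above the natural flow draw down storage. The roughly 16.0 million acre-foot flow and the approximately 60 million acre-feet of original storage are figures from this post; the assumed total annual use is a hypothetical round number for illustration.

```python
# Back-of-the-envelope drawdown arithmetic. Flow and storage figures are from
# the post; the assumed total annual use is hypothetical.
natural_flow_maf = 16.0   # long-term average natural flow, million acre-feet per year
assumed_use_maf = 17.5    # hypothetical combined deliveries to the basins and Mexico
storage_maf = 60.0        # approximate original system storage, million acre-feet

deficit = assumed_use_maf - natural_flow_maf   # 1.5 MAF per year in this example
years_of_cushion = storage_maf / deficit       # ~40 years at this deficit
print(f"A {deficit:.1f} MAF/yr deficit exhausts {storage_maf:.0f} MAF of storage "
      f"in about {years_of_cushion:.0f} years; deeper droughts shorten that quickly.")
```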
Wednesday, August 18, 2021
Child Dies in North Carolina
From an NCDHHS news release:
The North Carolina Department of Health and Human Services reports that a child died Friday after developing an illness caused by an amoeba that is naturally present in freshwater. The child became ill after swimming in a private pond at their home in central North Carolina in early August.

Laboratory testing at the federal Centers for Disease Control and Prevention confirmed the child’s illness was caused by Naegleria fowleri, an amoeba (one-celled living organism) commonly found in freshwater. Naegleria fowleri does not cause illness if swallowed but can be fatal if forced up the nose, as can occur when jumping into water, diving, water-skiing or taking part in other water activities.
Symptoms of Naegleria fowleri infection — an infection of the brain called primary amebic meningoencephalitis (PAM) — start with severe headache, fever, nausea and vomiting and progress to stiff neck, seizures and coma, and can lead to death. These rare infections usually occur when it is hot for prolonged periods of time, which results in higher water temperatures and lower water levels. Naegleria fowleri grows best at higher temperatures, up to 115°F.
"Our heart-felt condolences and sympathies are with the family and friends of this child," said State Epidemiologist Zack Moore, M.D. "Although these infections are very rare, this is an important reminder that this amoeba is present in North Carolina and that there are actions people can take to reduce their risk of infection when swimming in the summer."
There is no way to eliminate this amoeba from fresh bodies of water. In warmer areas where this infection has been more common, recommended precautions include:
- Limit the amount of water going up your nose. Hold your nose shut, use nose clips or keep your head above water when taking part in warm freshwater-related activities.
- Avoid water-related activities in warm freshwater during periods of high water temperature and low water levels.
- Avoid digging in or stirring up the sediment while taking part in water-related activities in shallow, warm freshwater areas.
Naegleria fowleri infections are rare, with only 147 known infections in the U.S. from 1962 through 2019. North Carolina had six cases during that time period. This amoeba can cause severe illness up to nine days after exposure. A person cannot be infected with Naegleria fowleri by drinking water, and the amoeba is not found in salt water or in properly maintained and chlorinated pools.
With the climate warming and heat waves increasing, we in Virginia should be aware of this risk to protect our children.
Sunday, August 15, 2021
Prince William Service Authority
Utilizing current projections of water usage, the County and the Metropolitan Washington Council of Governments estimate that the current level of water capacity is sufficient until at least 2040. This does not take into consideration any future development of the Rural Crescent, which has not been factored into the current projection; depending on the level of development in the Rural Crescent, capacity could be outgrown much sooner. In addition, these projections also do not take into account any increase in drought due to climate change.
The Prince William Service Authority distributes about 11 billion gallons of water per year to the residents of the county that are on public water. The communities that receive their water from the Service Authority can be seen below in the blue, pink and yellow areas. The remainder of the county obtains its water from private wells.
The maximum amount of water that can be delivered by the Service Authority is 67.8 million gallons a day (MGD). The source of that water is:
- Fairfax Water: 62.4 MGD
- Lake Manassas: 5.0 MGD
- Service Authority wells: 0.4 MGD
In fiscal year 2021 the average daily water demand was 30.0 MGD, with the maximum day at 49.3 MGD. As you can see, 92% of the water capacity for the county comes from Fairfax Water, which in turn draws water from both the Potomac River and the Occoquan Reservoir. At present, less than 1% of the public water supply comes from public groundwater wells. There was a time (40 years ago) when most of the water in Prince William came from public supply wells, but that came to an end when the public wells in Manassas were discovered to be contaminated with solvents. The only groundwater study of Prince William County was done in 1997, to identify the extent of the solvent contamination.
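The 92% figure and the remaining headroom follow directly from the capacity and demand numbers above; here is a quick check using only figures from this post.

```python
# Quick check of the shares and headroom implied by the figures above.
sources_mgd = {
    "Fairfax Water": 62.4,
    "Lake Manassas": 5.0,
    "Service Authority wells": 0.4,
}
total_mgd = sum(sources_mgd.values())      # 67.8 MGD maximum deliverable
avg_demand_mgd, max_day_mgd = 30.0, 49.3   # FY2021 average day and maximum day

print(f"Fairfax Water share of capacity: {sources_mgd['Fairfax Water'] / total_mgd:.0%}")  # ~92%
print(f"Average-day demand as a share of capacity: {avg_demand_mgd / total_mgd:.0%}")      # ~44%
print(f"Headroom on the FY2021 maximum day: {total_mgd - max_day_mgd:.1f} MGD")
```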
Based on growth projections that were made using the previous comprehensive plan, which did not allow any public water use in the Rural Crescent or consider any increase in the duration and severity of droughts due to climate change, existing capacity should be sufficient until 2040. However, the climate is changing and Prince William County is developing within the Rural Crescent.
Wednesday, August 11, 2021
IPCC Climate Report
On Monday, after a yearlong delay due to COVID, the U.N. Intergovernmental Panel on Climate Change (IPCC) released its first report on climate change since 2013. The report, issued by the IPCC’s Working Group I and approved by 195 member governments, is the first in a series leading up to the 2022 IPCC Sixth Assessment Report and serves as a motivational leadoff to the 26th UN Climate Change Conference of the Parties (COP26) in Glasgow, October 31-November 12, 2021.
Since the IPCC report is based on previously published peer-reviewed work, there were no surprises. (I will admit that I only read the Summary for Policymakers and sections of the discussion of the climate models in the key findings.) Climate change is widespread and intensifying. The concentration of greenhouse gases in the atmosphere is at its highest level since the dawn of mankind. Before the COVID-19 pandemic, where the data set used in the report ends, emissions of carbon dioxide had been rising by about 1% per year on average for the past decade. Renewable energy use has been expanding rapidly, but much of the renewable energy is being deployed alongside existing fossil energy, not replacing it. All the climate models tie the rise in temperature to concentrations of atmospheric carbon dioxide. The planet has warmed 1.1 degrees C since the late 19th century and is expected to warm an additional 0.4 degrees C in the next 20 years.
“At 1.5°C global warming, heavy precipitation and associated flooding are projected to intensify and be more frequent in most regions in Africa and Asia (high confidence), North America (medium to high confidence) and Europe (medium confidence). Also, more frequent and/or severe agricultural and ecological droughts are projected in a few regions in all continents except Asia.”
From IPCC
Mankind, by burning fossil fuels and covering the earth with concrete, is responsible for this rise in temperature. No actions that nations can likely agree to take at November’s COP26 meeting can change this trajectory; we only have some hope of moderating it. That is the key finding of the latest scientific report from the IPCC. Scientists use the five projection scenarios to demonstrate the difference that our coordinated action can make, but also warn: “The magnitude of feedbacks between climate change and the carbon cycle becomes larger but also more uncertain in high CO2 emissions scenarios (very high confidence). However, climate model projections show that the uncertainties in atmospheric CO2 concentrations by 2100 are dominated by the differences between emissions scenarios (high confidence). Additional ecosystem responses to warming not yet fully included in climate models, such as CO2 and CH4 fluxes from wetlands, permafrost thaw and wildfires, would further increase concentrations of these gases in the atmosphere (high confidence).”
The IPCC report finds that changes in the Earth’s climate in every region and across the whole climate system have occurred and are resulting in an increase in extreme weather events. “In 2019, atmospheric CO2 concentrations were higher than at any time in at least 2 million years (high confidence), and concentrations of CH4 and N2O were higher than at any time in at least 800,000 years (very high confidence).” “Global surface temperature has increased faster since 1970 than in any other 50-year period over at least the last 2000 years (high confidence).”
Recent work had shown an oversensitivity of the climate models to changes in CO2 levels; however, the actual warming of the earth has gone on long enough to identify the inconsistency and to constrain their projections somewhat. Nonetheless, the report finds the forecast warmer climate will intensify weather events and the resulting flooding or droughts, but the location and frequency of these events depend on projected changes in regional atmospheric circulation, including monsoons and mid-latitude storm tracks, and cannot be projected at this time.
It is virtually certain that global mean sea level will continue to rise over the 21st century. Long term, “sea level is committed to rise for centuries to millennia due to continuing deep ocean warming and ice sheet melt, and will remain elevated for thousands of years (high confidence).”
“Limiting human-induced global warming to a specific level requires limiting cumulative CO2 emissions, reaching at least net zero CO2 emissions, along with strong reductions in other greenhouse gas emissions. Strong, rapid and sustained reductions in CH4 emissions would also limit the warming effect resulting from declining aerosol pollution and would improve air quality.” Those are the marching orders for COP26 in Glasgow this fall.

Good luck to us all.
Sunday, August 8, 2021
Ethanol in Fuel
I have never been a fan of adding ethanol to fuel, but that may be changing. MIT researchers have found a way to achieve high yields of ethanol from different types of cellulosic feedstocks. Most ethanol in the U.S. is made from corn, not cellulose waste, and though corn ethanol is technically a “renewable” energy source, it has a large environmental footprint. Currently, around 40% of the U.S. corn harvest goes into ethanol. Corn that could be used to feed people is instead used to make ethanol, and cropland, pesticides, fertilizer, and water are used to produce corn just for fuel.
The Renewable Fuel Standard (RFS) is a federal program that requires fuel sold in the United States to contain a minimum volume of renewable fuels. The RFS required renewable fuel to be blended into transportation fuel in increasing amounts each year, escalating to 36 billion gallons by 2022. The RFS program only looks at greenhouse gases (GHGs) relative to the petroleum fuel the renewable fuel replaces; the program does not look at the overall environmental costs of the “renewable” fuel. It has been calculated that the RFS increased the need for cropland by 23%, in large part because ethanol production relies on corn.
From Luoye Chen et al 2021 Environ. Res. Lett.
According to Tyler Cowen, professor of Economics at George Mason University, in his book, An Economist Gets Lunch, New Rules for Everyday Foodies, “(To put ethanol into gasoline) costs a lot more money than does traditional gasoline, once the cost of the subsidy is included. Sadly, it does not even make the environment a cleaner place. The energy expended in growing and processing the corn is an environmental cost too…the nitrogen-based fertilizers used for the corn are major polluters. Ethanol subsidies are a lose-lose policy on almost every front, except for corn farmers and some politicians.” “For millions of (people in poor countries) it is literally a matter of life and death and yet we proceed with ethanol for no good reason…(Biofuels) has thrown millions of people around the world back into food poverty.”
According to a recently published article, “The economic and environmental costs and benefits of the renewable fuel standard” (Luoye Chen et al 2021 Environ. Res. Lett. 16 034021), maintaining the corn ethanol mandate at 15 billion gallons until 2030 will lead to a discounted cumulative economic cost of $199 billion over the 2016–2030 period. Their cost estimate includes $109 billion of economic costs and $85 billion of net monetized environmental damages; however, they do not account for the cost of water resources. The additional implementation of the cellulosic biofuel mandate of 16 billion gallons by 2030 increases the economic cost by $69 billion, which they find will be partly offset by the net discounted monetized value of environmental benefits of $20 billion, resulting in a net additional cost of $49 billion over the 2016–2030 period.
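The netting of the cellulosic-mandate figures is simple arithmetic; a quick check using the numbers quoted above (billions of discounted dollars over 2016–2030, as summarized from Chen et al. 2021):

```python
# Netting the cellulosic-mandate figures quoted above (billions of discounted
# dollars over 2016-2030, as summarized from Chen et al. 2021).
cellulosic_economic_cost = 69.0
cellulosic_environmental_benefit = 20.0
net_additional_cost = cellulosic_economic_cost - cellulosic_environmental_benefit
print(f"Net additional cost: ${net_additional_cost:.0f} billion")  # ~$49 billion
```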
Currently, feedstocks such as straw and woody plants, which are wastes, are difficult to use for biofuel production because they first need to be broken down into fermentable sugars, a process that releases numerous byproducts that are toxic to yeast, the microbes most commonly used to produce biofuels. Yet there are more than a billion tons of cellulosic biomass, including switchgrass, wheat straw, and corn stover (what is left in the fields after the corn is harvested). According to a U.S. Department of Energy study, this is enough biomass to replace 30%-50% of the petroleum used for transportation if the cellulosic biomass could be cleanly and efficiently turned into ethanol.
Now, MIT researchers have found a way to achieve high yields of ethanol with different types of cellulosic feedstocks, including switchgrass, wheat straw, and corn stover. From MIT News: “The MIT team built on a technique they had developed several years ago to improve yeast cells’ tolerance to a wide range of alcohols. In their new study, the researchers engineered yeast so that they could convert the cellulosic byproduct aldehydes into alcohols, allowing them to take advantage of the alcohol tolerance strategy they had already developed. They tested several naturally occurring enzymes that perform this reaction, from several species of yeast, and identified one that worked the best. Then, the scientists used directed evolution to further improve it.”
Yeast are generally not very efficient at producing ethanol from toxic cellulosic feedstocks; however, when the researchers used their improved enzyme and spiked the reactor with the membrane-strengthening additives, the strain more than tripled its cellulosic ethanol production, to levels matching traditional corn ethanol. “What we really want to do is open cellulose feedstocks to almost any product and take advantage of the sheer abundance that cellulose offers,” says Felix Lam, an MIT research associate and the lead author of the new study.
Gregory Stephanopoulos, the Willard Henry Dow Professor in Chemical Engineering, and Gerald Fink, the Sokol Professor at the Whitehead Institute for Biomedical Research and the American Cancer Society Professor of Genetics in MIT’s Department of Biology, are the senior authors of the paper, which appeared open access in Science Advances.
Though the President has gone all in on electric vehicles, looking to have half the vehicles sold in the United States be electric by 2030, that goal would require more electricity, a re-imagined and modernized grid, and charging stations, and still there would remain a tremendous number of gas-powered vehicles. The average age of a car in the United States is almost 12 years. So, even in a world where half of all cars sold in the United States are electric, there would still be fuel-burning vehicles for decades to come. Ethanol is a renewable, domestically produced transportation fuel. Whether used in low-level blends such as E10 (10% ethanol, 90% gasoline) or E15 (10.5% to 15% ethanol), or in E85 for flex-fuel cars (a gasoline-ethanol blend containing 51% to 83% ethanol, depending on geography and season), ethanol blends can reduce emissions and be part of the fuel lineup for the future.
Wednesday, August 4, 2021
Special Session of the VA Legislature
Within hours of returning to Richmond for a special session, the General Assembly budget committees adopted the Governor’s plan to spend $3.5 billion in Federal Covid-19 emergency aid. Below are the plans for investment in water and water infrastructure from a Virginia Press Release last week.
According to that plan, Governor Ralph Northam announced that Virginia plans to allocate $411.5 million of the $4.3 billion that Virginia received in federal American Rescue Plan (ARP) funding to reduce water pollution and increase access to clean water across the Commonwealth.
The proposal includes $186.5 million for improving wastewater treatment and nutrient removal at wastewater treatment plants, $125 million to reduce combined sewer overflows by funding projects in Richmond, Alexandria, and Lynchburg, and $100 million to assist water systems in small and disadvantaged communities.
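The three line items sum to the announced $411.5 million; a quick check (all figures from the release, in millions of dollars):

```python
# Check that the announced line items add up to the $411.5 million total.
allocations_millions = {
    "wastewater treatment and nutrient removal": 186.5,
    "combined sewer overflow projects": 125.0,
    "small and disadvantaged community water systems": 100.0,
}
print(f"Total: ${sum(allocations_millions.values()):.1f} million")  # $411.5 million
```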
These announced investments in water infrastructure are in addition to the more than $300 million in ARP funding that the Commonwealth sent to towns in June and the $2.3 billion made available to Virginia’s 133 counties and cities directly from the federal government to meet local response and recovery needs, which include improving access to clean drinking water and supporting vital wastewater and stormwater infrastructure. These projects are being funded by the national debt instead of the local rate payers needing to fund maintenance and improvement of their utilities. Virginia long ago abandoned "pay as you go."
“Protecting the environment, and particularly providing for sanitary disposal of wastewater, is critical to public health and the economy,” said Secretary of Natural and Historic Resources Matthew J. Strickler. “These investments will put us even closer to restoring the Chesapeake Bay, and will clean up streams and improve septic and sewer systems across the Commonwealth.” Rural and semi-rural residents should watch for funding for septic upgrades to become available.
Sunday, August 1, 2021
Occoquan Reservoir, the ICPRB and Your Water
From the Interstate Commission on the Potomac River Basin (ICPRB):
Despite not experiencing drought conditions locally, “Continued low flows in the Potomac River have triggered daily drought monitoring operations by ICPRB’s Section for Cooperative Water Supply Operations on the Potomac.”
During daily drought monitoring protocols, the ICPRB collects river flow data, precipitation data and forecasts, and usage data and forecasts from metropolitan area water suppliers to determine whether conditions warrant active management of water supplies and a potential release of stored water to meet demands.
This protocol was last used by ICPRB in August and September 2019, and before that in 2017. The dry conditions are affecting some smaller water systems in the basin. Front Royal, Va., has issued a call for voluntary water conservation because of low flows, despite adequate rainfall according to the Drought Monitor.
The Washington, DC, metropolitan area (WMA) is home to almost six million residents and workers. The region’s water suppliers have an important responsibility beyond supplying the needs of the residents: to provide 24/7 water that ensures the federal government, including Congress, the Pentagon, and key agencies, can function. The water suppliers share the Potomac River as the major regional water resource, and so 40 years ago they came together to form the Interstate Commission on the Potomac River Basin (ICPRB) and a cooperative agreement for funding and using the water resources available regionally.
The Potomac River flow fluctuates with season and weather. The ICPRB helps manage the river’s water resources. The cooperative agreement was created, and the Jennings Randolph Reservoir was built, to manage the use of the Potomac River and to ensure that there is enough flow for essential services like wastewater assimilation and habitat maintenance. The ICPRB monitors river flows and water withdrawals to ensure the 100 million gallons per day minimum flow at Little Falls.
That minimum flow level has been maintained since the early 1980's, but during times of drought, natural flows on the Potomac are not always sufficient to allow water withdrawals by the utilities while still maintaining the minimum flow in the river. When necessary, the ICPRB allocates and manages water resources of the river using the jointly owned Jennings Randolph Reservoir, the Potomac River Low Flow Allocation Agreement, and the Water Supply Coordination Agreement. The reservoir and agreements were part of a water management scheme developed by scientists at Johns Hopkins University. For decades they have been used to jointly improve the reliability of the water supply.
The tools available to the ICPRB to manage water use are to have members utilize their in-system storage or the shared system storage and reduce their water withdrawals. Fairfax Water, which supplies over 85% of the Prince William Service Authority’s water, has a reservoir on the Occoquan River that is outside the freshwater drainage area and is supplied by the Occoquan River and recycled wastewater from UOSA (the Upper Occoquan Service Authority wastewater treatment plant). So, when necessary, the ICPRB requires Fairfax Water to reduce the water it draws from the Potomac and increase the water drawn from the Occoquan Reservoir.
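A minimal sketch of the balancing act described above may help. The 100 MGD minimum flow at Little Falls is from this post; the forecast flow and demand numbers, and the function itself, are hypothetical illustrations rather than the ICPRB's actual operating rules.

```python
# Illustrative sketch of the low-flow balancing act described above.
# The 100 MGD minimum flow at Little Falls is from the post; the forecast flow
# and demand numbers are hypothetical.
MIN_FLOW_MGD = 100.0

def potomac_withdrawal_allowed(forecast_flow_mgd: float) -> float:
    """Water available for withdrawal while preserving the minimum environmental flow."""
    return max(forecast_flow_mgd - MIN_FLOW_MGD, 0.0)

forecast_flow = 450.0    # hypothetical low-flow forecast at Little Falls, MGD
regional_demand = 400.0  # hypothetical combined supplier demand, MGD

available = potomac_withdrawal_allowed(forecast_flow)
shortfall = max(regional_demand - available, 0.0)
print(f"Potomac can supply {available:.0f} MGD; shift {shortfall:.0f} MGD to "
      "the Occoquan Reservoir or Jennings Randolph releases.")
```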
The reservoir’s current storage capacity is estimated at 8.3 billion gallons. Water from the Occoquan Reservoir can only supply the Griffith treatment plant, which predominantly serves the customers in the eastern portion of Fairfax Water’s service area and the Eastern Distribution System of Prince William County. However, Fairfax Water has a connector that can transfer water from the Griffith plant to the western portion of its service area (and the Prince William Western Distribution Area) normally supplied by the Corbalis plant using water drawn from the Potomac River.
Two thirds of the Occoquan Watershed that supplies the Occoquan Reservoir is in Prince William County. On November 17, 2020, the Prince William County Board of Supervisors issued Directive No. 20-86 for county staff to develop a protection overlay district for the Occoquan Reservoir. So far county staff have reviewed a recent report on the Occoquan Watershed and Reservoir system water quality prepared by Virginia Tech and their Occoquan Watershed Monitoring Lab. The county staff has discussed the report's findings with the Northern Virginia Regional Commission. Staff is also reviewing reports and recommendations from local committees and environmental groups and evaluating current design standards and development practices in relation to water quality trends in the Reservoir.
Staff is expected to recommend a zoning text amendment for an overlay district and/or a process to revise the Design and Construction Standards Manual to provide increased protection for water quality sometime in the future. An overlay district is used to put special restrictions on land use or grant special rights to some land. An overlay district could be used to limit the types and amount of development on land within the watershed to protect the Occoquan Reservoir; it could also be ineffective if drawn too loosely or constantly overridden by the Board of Supervisors. The problem is not that Prince William County holds about two thirds of the Occoquan Watershed; it is that the Occoquan Watershed is more than two thirds of Prince William County. To properly protect the Occoquan Watershed and the regional water supply, the use of the remaining open land must be severely restricted. The rest of the region needs to pay landowners for the protection of the Occoquan Watershed.
This needs to happen now; sometime in the future may be too late to protect this essential portion of our water supply. Recently, the Prince William County Board of Supervisors approved the development of the Preserve at Long Branch, rezoning a portion of the Rural Crescent adjacent to the Occoquan River. Also approved this spring was the Independent Hill Small Area Plan. No analysis was done of the potential impact of these developments on the hydrology of the Occoquan Watershed, and there is no understanding of what impact they might have on the quality of, and supply to, the Occoquan Reservoir. Yet the Occoquan Reservoir is irreplaceable for the region.
Other threats to the watershed are under consideration by the Board of County Supervisors: the revival of the Bi-County Parkway, this time called the Va. 234 Bypass, and the proposal from Maryanne Gahaban and Page Snyder. These two Rural Area large landowners are pushing a proposal to convert almost 800 acres of agriculturally zoned land (in which they each have significant ownership) to industrial data centers. Once more, no analysis has been done of the potential impact of these developments on the hydrology of the Occoquan Watershed, and there is no understanding of what impact they might have on the quality of, and supply to, the Occoquan Reservoir.
During development the primary impact is erosion and sediment carried by stormwater into the streams. Post-development, the primary impact is increased stormwater volume and velocity caused by the removal of tree canopy cover and the replacement of pervious surfaces of plants and grass with impervious surfaces such as roads, parking lots, rooftops, driveways, and patios.
Development increases impervious surface area, and this has created in the past, and will in the future create, a host of concerns for managing the Occoquan Watershed. For instance, the physical condition of the Watershed's tributaries has been measured to decline with development. Increased stormwater runoff from impervious surfaces flows into streams and creeks at a higher volume and velocity. The result is increased erosion of stream banks that leaves a degraded ecosystem.
Development impacts water quality. Minimizing impervious surface cover and maintaining the tree canopy are critical to the protection of the County’s streams, which flow to the Occoquan and other reservoirs. There is a direct correlation between stream health and impervious surface cover and tree canopy. According to the Northern Virginia Regional Commission, watersheds with impervious surface cover of 10 to 15% show clear signs of degradation, while watersheds with impervious surface cover greater than 15-25% typically do not support a diverse stream ecology and are dying.
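Those NVRC thresholds translate into a simple rule of thumb. The sketch below uses the percentages cited above, but the post's ranges overlap slightly, so the cutoffs here are a loose interpretation and the category labels are my own shorthand.

```python
# Rule-of-thumb classification based on the NVRC impervious-cover thresholds
# cited above. The cutoffs are a loose interpretation of the post's ranges,
# and the category labels are my own shorthand.
def stream_health(impervious_pct: float) -> str:
    if impervious_pct < 10:
        return "generally healthy"
    if impervious_pct <= 15:
        return "clear signs of degradation"
    if impervious_pct <= 25:
        return "degraded"
    return "typically cannot support a diverse stream ecology"

for pct in (5, 12, 20, 30):
    print(f"{pct}% impervious cover: {stream_health(pct)}")
```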