Wednesday, September 20, 2023

Dead Zone, Late Season Update


The “Dead Zone” of the Chesapeake Bay refers to a volume of hypoxic water characterized by dissolved oxygen concentrations of less than 2 mg/L, which is too low for aquatic organisms such as fish and blue crabs to thrive. Within that hypoxic volume, the life of the bay dies and a “Dead Zone” forms. The Chesapeake Bay experiences hypoxic conditions every year, with the severity varying from year to year depending on nutrient and freshwater flows into the bay, wind, and temperature.

In June researchers from the Chesapeake Bay Program, the University of Maryland Center for Environmental Science, the University of Michigan, and the U.S. Geological Survey announced that they were predicting the 2023 dead zone would be the smallest on record since monitoring began in 1985.

This week researchers at the Virginia Institute of Marine Science (VIMS) reported that hypoxia in summer 2023 began in mid-May, increased to a moderate level, and then leveled off. They believe that this leveling off resulted from a change in the average wind direction in late May. Hypoxia has remained at low to moderate levels throughout June, July, and into August. The spring-time nutrient supply to the Bay was relatively low and June was relatively windy, both of which may have contributed to June through August having a low amount of hypoxia. 2023 is turning out to be a low-hypoxia year for the Bay; we’ll know sometime over the next few months whether it turned out to be the lowest on record.

from VIMS

Each year the Maryland Department of Natural Resources measures the actual dissolved oxygen in the Maryland portion of the Chesapeake Bay main stem and the size of the Dead Zone. Meanwhile, the Virginia Institute of Marine Science (VIMS), Anchor QEA, and collaborators at UMCES operate a real-time three-dimensional hypoxia forecast model (www.vims.edu/hypoxia) that uses measured inputs, including National Weather Service wind monitoring data, to predict daily dissolved oxygen concentrations throughout the Bay.

Sunday, September 17, 2023

UK keeps Nutrient Neutrality

Nutrient neutrality rules were first introduced in the European Union (EU) in 2017. These rules were designed to stop developers from polluting local wetlands and waterways in protected areas when building homes. When the United Kingdom left the EU, it retained those environmental protection rules to prevent excessive nutrient pollution of vulnerable waterways and wetlands.

Excessive amounts of nutrients can lead to a “Dead Zone,” a volume of hypoxic water characterized by dissolved oxygen concentrations of less than 2 mg/L, which is too low for aquatic organisms to thrive. Within that hypoxic volume, the life of the waterway dies and a “Dead Zone” forms. Dead Zones occur most often in estuaries and coastal waters, but also in inland lakes, rivers, and streams.

Increasingly, water bodies across the globe are experiencing hypoxic conditions every year, with the severity varying from year to year depending on the nutrient and freshwater flows into the water from agriculture, suburban and urban runoff, and wastewater treatment and release, as well as wind and temperature. Better nutrient control in wastewater eliminated the Dead Zone in the Thames River.

Under the current “nutrient neutrality” rules, 62 local authorities in the UK cannot allow new developments unless projects in protected areas can be shown to be "nutrient neutral" – not releasing any more nutrients after the development than before. In practice the rules ensure that any increase in nutrient pollution from stormwater runoff is offset by a reduction in pollution in the same area. This has increased costs for home builders in the UK. The current UK Tory government has been under pressure to increase the country's housing stock, after warnings earlier this year that residential construction could fall to its lowest level since World War II.

So, the Government introduced an amendment to the Levelling Up and Regeneration Bill in the House of Lords that would have removed the policy. The amendment was rejected by the members of the House of Lords, the peers, over the risk it would pose to the environment. The “nutrient neutrality” rule stands, and the government will have to introduce a new bill to try again.

Closer to home, we have the “Chesapeake Bay Clean Water Blueprint,” the newer and better name for the enforceable limits on nitrogen, phosphorus, and sediment pollution in the Chesapeake Bay (formerly called the Bay TMDL) mandated by the EPA for Virginia, the other five Bay states, and the District of Columbia. Each of the jurisdictions created a plan (approved by the EPA), called a Watershed Implementation Plan or WIP, to meet the required reductions in nutrient pollution by 2025. The states agreed to have 60% of the needed programs and practices in place by 2017, and to complete the job by 2025.

Virginia remains on track to achieve its 2025 pollution-reduction commitments, largely due to aggressive action the Commonwealth took on wastewater treatment plants. Those actions account for over 90% of the nitrogen and phosphorus reductions since the Blueprint’s establishment. This progress currently keeps Virginia on track overall, even though the Commonwealth is not meeting its commitments to reduce polluted runoff from agriculture and urban and suburban areas.

In June researchers from the Chesapeake Bay Program, the University of Maryland Center for Environmental Science, the University of Michigan, and the U.S. Geological Survey announced that they were predicting this year's dead zone would be 33% smaller than the historic average (1985-2022), which would make it the smallest dead zone on record if the forecast proves accurate. This was a feel-good moment for the Chesapeake Bay.

However, the significantly smaller-than-average forecast is due in large part to a lack of rainfall and mild drought this past spring. Less rainfall means lower river flows, but it also generally means fewer nutrients being washed off the land and into the water. The apparent progress may be temporary. Pollution from stormwater continues to grow with population, and the increased intensity of rainstorms could end up undermining all our other efforts.

Wednesday, September 13, 2023

Townhall for PW Climate Plan

A virtual Town Hall on the Draft Community Energy and Sustainability Master Plan (CESMP) will be held at 6:30 p.m. Thursday, September 14, 2023. Sign up here. The Prince William Office of Sustainability is seeking feedback from the public on the draft CESMP. The CESMP is being developed to serve as a roadmap for meeting the county’s Climate Mitigation and Resiliency goals: to reduce Prince William's greenhouse gas emissions and to increase resilience to the effects of climate change. These goals include Prince William County cutting CO2 emissions to 50% of 2005 levels by 2030 and reaching net-zero by 2050, but they also include plans for adaptation to climate change.

I am a member of the Sustainability Commission and ask that you participate in the process and help us build a sustainable future for our children. I am concerned that the draft CESMP relies too heavily on unrealistic and expensive assumptions and on the purchasing of carbon credits to meet the climate goals. At the close of the COP-27 meeting, UN Secretary-General Guterres decried bogus net-zero pledges and called for bringing integrity to net-zero commitments by industry, financial institutions, cities and regions, and for supporting a global, equitable transition to a sustainable future.

“Using bogus ‘net-zero’ pledges to cover up massive fossil fuel expansion is reprehensible. It is rank deception. This toxic cover-up could push our world over the climate cliff. The sham must end.” Mr. Guterres said that net-zero pledges should be accompanied by a plan for how the transition is being made. Prince William’s climate goals could be moderated to what is possible for us to achieve. There is hope that Prince William will be able to “bend the curve” if smart decisions are made in the future.

Though the PW Board of County Supervisors has adopted the net-zero pledge of the MWCOG, the decisions they have made within their nexus of control, land use and roads, have worked against that resolution. In fact, decisions made within our county and Loudoun will challenge the goals of the Virginia Clean Economy Act (VCEA). In 2020, the General Assembly passed the VCEA, which mandated a goal of 100% zero-carbon energy generation by 2050. The VCEA was to be the major tool in achieving the County's goals. After all, if all power were electric and carbon free, there would be no problem in achieving net zero.

However, the VCEA is facing challenges that may prevent it from achieving its goals in the stated time frame. According to Virginia Energy, the VCEA requires the Commonwealth to retire its natural gas power plants by 2045 (Dominion) and 2050 (Appalachian Power). These facilities currently comprise 67% of baseload generation as well as 100% of the power plants that meet peak demand. In May, Dominion Energy filed its 2023 Integrated Resource Plan (IRP) with the State Corporation Commission (SCC); under that plan, Dominion’s carbon emissions would instead increase from current levels. In the IRP submitted to the SCC, Dominion forecasted that power demand would rise 80% and that peak load would rise from a bit more than 17,000 megawatts now to 27,000 megawatts by 2037. You cannot plan for that amount of electricity demand growth over 10 years while eliminating generation capacity. It has never been done, and Dominion admits that it needs to not only keep all of its fossil fuel power generation operating, but is asking to build more dispatchable fossil fuel generation to meet this forecast demand. Now, the SCC is examining and challenging the growth assumptions that went into that forecast. It remains to be seen what will actually happen.
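To put those numbers in perspective, here is a rough back-of-the-envelope check (my own arithmetic, not anything in the IRP) of the compound annual growth rate that a jump from roughly 17,000 MW to 27,000 MW of peak load by 2037 implies:

```python
# Rough sanity check of the IRP peak-load forecast (my arithmetic, not Dominion's).
start_mw, end_mw = 17_000, 27_000
years = 2037 - 2023

total_growth = end_mw / start_mw - 1             # total growth in peak load
cagr = (end_mw / start_mw) ** (1 / years) - 1    # compound annual growth rate
print(f"total peak growth: {total_growth:.0%}, about {cagr:.1%} per year for {years} years")
# ~59% total, roughly 3.4% per year -- sustained for 14 years while the VCEA
# schedule retires the dispatchable plants that serve that peak.
```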

It appears that electrifying the transportation sector also faces hurdles in cost and accessibility, along with challenges to electrifying all heating systems countywide. Building more roads and more housing away from transportation hubs will also increase the greenhouse gas footprint of the county. Beyond the climate resolution, most decisions of the PW BOCS have contributed to the growth in power demand. We need to change directions and bend the curve down.

Community Energy and Sustainability Master Plan (pwcva.gov)


Sunday, September 10, 2023

Prince William’s Perennial Streams Drying Up, an Ominous Sign

This has been a dry water year (October 1 to September 30), the first one in a decade. The average rainfall in the Potomac River Basin for August was 0.8 inches below normal, but 2.25 inches below normal here in Haymarket. The cumulative rain deficit for the Potomac Basin was about 7.1 inches, but until Friday's deluge here in Haymarket the year-to-date deficit was 11.8 inches. Stream flow across the basin remains below average, and groundwater monitoring indicates below-normal levels.

The Potomac River, its tributaries, reservoirs and the associated groundwater resources are the source of drinking water for the over 6,000,000 people in the Washington Metropolitan area. The Interstate Commission on the Potomac River Basin (ICPRB) coordinates water supply/withdrawal operations for the Potomac River during times of drought and recommends releases of stored water from the jointly owned reservoirs. This is to ensure adequate water supplies for the large Washington metropolitan area water companies and for environmental flow levels.
from Drought Monitor 9/7/2023

Much of the Potomac watershed is currently in D1 drought (beige) according to the U.S. Drought Monitor, though more rain could change this in the usually wet fall. However, current conditions have triggered the Metropolitan Washington Council of Governments (MWCOG) Drought Coordination Technical Committee to consider initiating a “Drought Watch” stage. For now, the Potomac River’s flows are adequate to meet the water demands of the Washington metropolitan area without requiring releases from upstream reservoirs. However, the groundwater, an essential part of our water supply, remains an unknown. Groundwater has very little monitoring and management, but there have been some troubling observations recently.

Round Hill and Purcellville, Virginia, whose town water supplies come from a series of wells, have in recent days issued water conservation notices to utility customers as dry conditions continue to persist in the region. Round Hill Town staff were concerned that a creek near one of the town's wells had dried up and that Catoctin Creek in Purcellville had also run dry. Two of Purcellville's town wells have been pulled off-line to allow them to recharge.

In Haymarket, there were also signs of concern. These pictures were sent to me from the Bull Run Mountain Conservancy last Friday morning. They showed that the perennial streams Little Bull Run and Catlett’s Branch were dry. Catharpin Creek, another perennial stream, appeared to have been reduced to a series of puddles. This was the driest the Conservancy had ever seen these streams. The Bull Run watershed is an essential part of the Occoquan watershed, which directly supplies water to 800,000 people in Northern Virginia and allows the ICPRB to “ask” Fairfax Water to draw less from the Potomac River in times of need.
Catharpin Creek from BRM Conservancy, M. Kieffer

Catlett's Branch from BRM Conservancy, M. Kieffer

Little Bull Run from BRM Conservancy, M. Kieffer

Generally, groundwater in the Culpeper Basin is renewed each year through precipitation. The water stored in the watershed has always been able to provide adequate water in droughts because, historically, the withdrawal of water was within the average recharge rate. However, the only nearby U.S. Geological Survey groundwater monitoring well is no longer stable. The water level has been slowly falling since before the last drought, despite a series of wet years.

USGS Groundwater Monitoring Well 49V1
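For readers who want to track a monitoring well themselves, below is a minimal sketch, assuming the USGS NWIS Daily Values web service and its parameter code 72019 (depth to water, feet below land surface). The SITE_ID is a placeholder: the post identifies the well only by its local name, 49V1, so you would need to look up its numeric NWIS site ID.

```python
# A minimal sketch (not the author's workflow) of pulling daily water levels
# for a USGS monitoring well from the public NWIS Daily Values service.
import requests

SITE_ID = "000000000000000"  # hypothetical placeholder, not the real site number

url = "https://waterservices.usgs.gov/nwis/dv/"
params = {
    "format": "json",
    "sites": SITE_ID,
    "parameterCd": "72019",   # depth to water level, feet below land surface
    "startDT": "2000-01-01",
    "endDT": "2023-09-10",
}
resp = requests.get(url, params=params, timeout=30)
resp.raise_for_status()

series = resp.json()["value"]["timeSeries"][0]["values"][0]["value"]
levels = [(pt["dateTime"], float(pt["value"])) for pt in series]
# Depth to water grows as the water table falls, so a rising number is bad news.
print(f"{len(levels)} daily readings; latest depth to water: {levels[-1][1]} ft")
```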

Properly managed and protected groundwater can be extracted indefinitely and still serve its ecological function. Groundwater recharge through precipitation requires adequate area for infiltration, control of the sheet flow created by roads and paved areas, and protection of the most geologically favorable infiltration points. In a natural environment much of the precipitation soaks into the ground. Some water infiltrates deep into the ground and replenishes aquifers, which store huge amounts of freshwater for long periods of time. Some infiltration stays close to the land surface and can seep back into rivers, creeks, and ponds through the hyporheic zone.

A stream is a living ecosystem. It includes not just the water flowing between the banks but the earth, life and water around and under it. Beneath a living streambed is a layer of wet sediment, small stones and tiny living creatures called the hyporheic zone. Stream water filters down into this dynamic layer between surface water and groundwater, mixing with the groundwater pushing up to feed the stream during dry spells. Groundwater cannot push up through the hyporheic zone if the groundwater level has fallen too low; the stream becomes disconnected from the groundwater. The groundwater level usually falls due to overuse and reduced recharge.

Maintaining open areas provides areas of groundwater recharge. According to the U.S. Environmental Protection Agency, impervious cover levels as low as 5%-10% can significantly impact watershed health, increasing stormwater runoff and reducing groundwater recharge. When runoff volume increases, runoff velocity and peak storm flows increase, and you get flooding, soil erosion, fast-moving stormwater carrying contamination, and reduced or eliminated infiltration into groundwater. The groundwater is essential as the base flow to the streams and rivers that feed the Occoquan Reservoir during the dry months. Is this a little hint of the beginning of the end?
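To see why pavement matters so much, here is a small illustration (my own, not from the EPA material cited above) using the standard NRCS/SCS curve-number runoff formula with representative textbook curve numbers:

```python
# Back-of-the-envelope runoff comparison using the NRCS/SCS curve-number method.
# Curve numbers here are representative textbook values, not site-specific data.
def scs_runoff(p_in: float, cn: int) -> float:
    """Direct runoff (inches) for rainfall p_in over a surface with curve number cn."""
    s = 1000.0 / cn - 10.0   # potential maximum retention (inches)
    ia = 0.2 * s             # initial abstraction before runoff begins
    if p_in <= ia:
        return 0.0
    return (p_in - ia) ** 2 / (p_in - ia + s)

storm = 2.0  # a 2-inch storm
for label, cn in [("wooded, good condition", 60), ("impervious pavement", 98)]:
    print(f"{label} (CN={cn}): runoff = {scs_runoff(storm, cn):.2f} in")
# ~0.06 in vs ~1.77 in: the paved surface sheds roughly 30x the runoff --
# water that no longer infiltrates to recharge groundwater.
```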

This weekend's rain might bring some small relief to the region, and the long-term forecasts call for 1 to 2 inches of rain from the remnants of Hurricane Lee in the coming days. The trajectory of Hurricane Lee is still up in the air, but it could bring some significant rain as it heads our way. Nonetheless, bad land use decisions and poor management of our water resources are magnifying precipitation changes due to a changing climate.

 

Wednesday, September 6, 2023

Well Owners are Responsible for their Water

While the U.S. Environmental Protection Agency (EPA) regulates public water systems, the responsibility for ensuring the safety and consistent supply of water for the more than 21 million private wells belongs to the well owner. These responsibilities should include knowing the land and well’s history, testing the water quality annually, and having any well system repairs performed by a well driller licensed or certified by the appropriate state agency where the well is located. In Virginia that is the Department of Professional and Occupational Regulation, DPOR.

Groundwater is the largest and most reliable source of freshwater on earth. In the United States, 26% of publicly supplied water comes from groundwater, in addition to all the water that private wells pump for the homes they supply. It has always been assumed that groundwater is protected and safe, but that turns out to be less and less certain. Groundwater and surface water are connected in many ways, not all of them fully understood. Wastewater used for agricultural irrigation recharges groundwater, and effluent discharge from wastewater treatment plants is intentionally and accidentally finding its way into groundwater. In Los Angeles wastewater effluent is used to recharge the groundwater, septic systems return their effluent water to groundwater, and several studies by the U.S. Geological Survey (USGS) scientists Paul M. Bradley and Larry B. Barber (and others) have shown that wastewater contaminants, including pharmaceuticals, are carried not only downstream into drinking water intakes, but into the shallow groundwater at least 65 feet from the stream.

Scientists are finding that groundwater aquifers are vulnerable to a wide range of man-made and naturally occurring contaminants. Only some of the substances have regulatory or human health screening levels. The presence of a contaminant in water does not necessarily mean that there is a human-health concern. Whether a particular contaminant in water is potentially harmful to human health depends on the contaminant’s toxicity and concentration in drinking water. Other factors include the susceptibility of individuals, amount of water consumed, and duration of exposure.

The USGS has done lots of groundwater testing over the years. In one study published in 2012, the USGS found that 10 contaminants were widely detected in groundwater and that a small percentage of the detections were at concentrations greater than human-health recommended levels. Of the ten contaminants, seven were from natural sources and three were man-made. The seven contaminants from natural sources included four geological trace elements (arsenic, manganese, strontium, and boron) and three radionuclides (radon, radium, and gross alpha-particle radioactivity). Radon has been considered several times for regulation in drinking water in the past, but never seems to make the cut.

The three contaminants from mostly man-made sources that exceeded MCLs were nitrate (a nutrient), dieldrin (an insecticide that has been banned by the US EPA, but was previously used for termite control and other applications), and perchloroethene (or PCE, a solvent and degreasing agent used for dry cleaning). Each of these contaminants was widely detected in the groundwater tested. Nitrate occurs naturally, but most nitrate concentrations greater than 1 milligram per liter (one-tenth of the nitrate MCL) originate from man-made sources such as fertilizers, livestock, and human wastewater from septic systems or wastewater treatment plants.

Installation of private wells is regulated by various state agencies, but these regulations do not require testing the groundwater for a suite of contaminants. The state and local agencies that oversee private wells are usually responsible for approving the location of a well, inspecting the well after construction to verify proper grouting and adequate water yield, maintaining records of the well driller’s log, and verifying the most basic potability of the water by requiring, at a minimum, bacterial testing. In some regions of the country the Department of Health tests wells annually.

A drinking water well that is contaminated could significantly impact your health and the value of your property. There are no national regulations or standards for testing a private well; however, I test my drinking water for all the primary and secondary contaminants of concern to the US EPA under the Safe Drinking Water Act every few years, and for a smaller list of 14 contaminants annually.

As the providers of our own water supply we need to serve as our own watchdogs and ensure our water supply is safe; no one else will. Part of the price of your own water supply is maintaining it and testing it. The local health departments have local rules and regulations for the installation of wells and can often help with testing for bacteria and nitrates, which are the typical contaminants from septic systems, drain fields and livestock, but as the well owner you will need to take the initiative.

The water well test that was performed when you bought your house probably only tested for bacteria and nitrates (unless you live in New Jersey), which is inadequate to be certain that your water is safe to drink. In addition, the EPA recommends that you test your water well every year for total coliform bacteria, nitrates, total dissolved solids, and pH levels at a minimum. If you suspect other contaminants, test for those. Always use a state certified laboratory that conducts drinking water tests.

On March 14, 2023, EPA announced the proposed National Primary Drinking Water Regulation (NPDWR) for six Per- and Polyfluoroalkyl Substances (PFAS) including perfluorooctanoic acid (PFOA), perfluorooctane sulfonic acid (PFOS), perfluorononanoic acid (PFNA), hexafluoropropylene oxide dimer acid (HFPO-DA, commonly known as GenX Chemicals), perfluorohexane sulfonic acid (PFHxS), and perfluorobutane sulfonic acid (PFBS). 

In a recent study by the USGS, at least one PFAS (of the group tested for) was detected in 20% of private wells (55/269 tested) and 40% of the public-supply samples (182/447) collected throughout the US (McMahon et al., 2022). Median cumulative PFAS concentrations (estimated given the detection limits) were comparable between public-supply (median = 7.1 ng/L) and private-well point-of-use tap water (median = 8.2 ng/L). Private well owners are going to have to address that problem, but first we need to have the public water suppliers figure it out for us and adapt their solutions to our situations. I’m still waiting for an easy-to-use test before I test my well.

According to the Water Systems Council, you need to monitor the condition of the wellhead and inspect the well system annually. In their publications developed in partnership with the EPA, they recommend that you routinely inspect your wellhead several times a year. Check the condition of the well covering, casing and well cap to make sure all are in good repair, leaving no cracks or other entry points for potential pollutants. Note any changes in condition. In addition, you should have the well system, including the pump, storage tank, pipes and valves, and water flow, inspected every 5-10 years by a qualified well driller or pump installer. The soil type, groundwater supply, materials of construction, and depth of the well will determine the life of the well. Many wells can continue to produce water after a pump has failed and only need a new pump to return to service. This is especially true in areas of hard water, where the well pump can have a relatively short life. If you notice a change in your water pressure, it may be time to have your system inspected. Do not ignore any changes in your water supply.

A drop in water pressure can originate in the pressure tank, the pressure switch, the pump, or the well and water supply. A loss of charge in the pressure tank can be caused by a leak in the bladder. Pressure to the tank is controlled by an electric switch that turns the pump on when pressure is low and off when the proper tank pressure is reached; a pressure switch can fail. In the well, a diminished water supply can be caused by a drop in the water level due to drought or over-pumping of the aquifer, by iron bacteria or other buildup in the pipe, or by a failing well; a drop in pressure could also be caused by a failing or damaged pump. Of course, a drop in water pressure could just be caused by increased demand: if your pump is undersized for the number of plumbing fixtures in the house, then using more than one bathroom at a time or doing laundry while hosing down the patio will cause a noticeable drop in water pressure.

These are just examples of the kind of understanding you need to have when you operate your own water system. When you own a well, you are in charge, you are responsible, you need to be informed. 

Monday, September 4, 2023

Groundwater and a Sustainable U.S.


America is blessed with a wealth of groundwater and surface water that helped create the 20th-century America of vast cities, industry and bountiful farmland. That era may be ending. We are using up our groundwater faster than it can be recharged.

The New York Times just published an excellent article: “America Is Draining Its Groundwater Like There’s No Tomorrow.” For the article, the Times performed an analysis of over 84,500 U.S. Geological Survey (USGS) groundwater monitoring wells. They found an emerging crisis that threatens American prosperity. The Times analysis found: “Nearly half the sites (groundwater levels) have declined significantly over the past 40 years as more water has been pumped out than nature can replenish.”

The Times looked at the water level in the groundwater wells. The level in a well usually fluctuates naturally during the year. Groundwater levels tend to be highest in the early spring, in response to winter snow melt and spring rainfall, when the groundwater is recharged. Groundwater levels begin to fall in May and typically continue to decline during summer as plants and trees use the available shallow groundwater to grow and streamflow draws water. Natural groundwater levels usually reach their lowest point in late September or October, when fall rains begin to recharge the groundwater again and the leaves fall from the trees, reducing their need for water. For a long time this seasonality can disguise a diminishing groundwater supply, but over time the falling trend emerges.
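One simple way to see through that seasonal cycle, sketched below under the assumption that you have a pandas Series of daily depth-to-water readings indexed by date (for example, parsed from the NWIS download shown earlier), is a one-year rolling average:

```python
# A minimal sketch (my illustration, not the Times' methodology) of stripping the
# seasonal cycle from daily depth-to-water readings with a 365-day rolling mean,
# so a multi-year decline stands out.
import pandas as pd

def annual_trend(depth_to_water: pd.Series) -> pd.Series:
    """Smooth daily depth-to-water readings with a one-year rolling mean."""
    daily = depth_to_water.asfreq("D").interpolate()   # fill occasional gaps
    return daily.rolling(window=365, min_periods=300).mean()

# A rising rolling mean of depth-to-water means the water table is falling,
# regardless of the spring-high / fall-low swing within each year.
```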

The New York Times focused on irrigated agriculture, which was the first problem to emerge, but groundwater levels are also affected by how many other wells draw from the aquifer, how much groundwater is being used in the surrounding area for agricultural, industrial or commercial use, and how much groundwater is being recharged. Development of an area can impact groundwater recharge. Land use changes that increase impervious cover from roads, pavement and buildings do two things: they reduce the open area for rain and snow to seep into the ground and percolate into the groundwater, and they increase stormwater velocity, preventing water from having enough time to percolate into the earth, increasing storm flooding, and preventing recharge of groundwater. Slowly, over time, this can reduce the groundwater supply, and the water table falls.

Increasing population density increases water use. Significant increases in groundwater use and reduction in aquifer recharge can result in slowly falling water levels that indicate that the water is being used up. Unless there is an earthquake or other geological event, groundwater changes are not abrupt, and problems with water supply tend to happen very, very slowly as demand increases with construction and industrial use (for data center cooling, for example), and recharge is impacted by adding paved roads, driveways, commercial buildings, houses and other impervious surfaces.

The demand for water rises as population, economic activity and agricultural irrigation grow. However, accessible water resources are actually decreasing due to overuse and pollution. Water resources can be used sustainably only if their volume and variation through time are understood and managed. Groundwater availability varies by location and is limited. Precipitation and soil type determine how much the shallower groundwater is recharged annually. However, the volume of water that can be stored is controlled by the reservoir characteristics of the subsurface rocks. Groundwater may be present today in places with very dry climates because of the nature of the local geology and the historic climate cycles that have occurred through time.

A tiny group of the USGS monitoring wells examined are in Virginia. These wells measure groundwater conditions daily and can be viewed online. Only one of the Virginia wells is within the former Rural Crescent in Prince William County. That well is in the northwest corner of the county, just west of Route 15, in the Culpeper groundwater basin. This area supplies groundwater for private drinking water wells, but it is also an essential element of the Bull Run portion of the Occoquan watershed, which provides drinking water to over 800,000 people in Northern Virginia.


from USGS


from USGS

It is clear from the first USGS graph that the groundwater level in well 49V1 has been falling since before 2008. The groundwater is being used up. In the second graph you can see that, for decades before that, the groundwater level was fairly stable.

Virginia is dependent on groundwater. According to information from Virginia Tech, the Rural Household Water Quality program and the National Groundwater Association, approximately 30% of Virginians are entirely dependent on groundwater for their drinking water. In Prince William County about a fifth of residents get their water directly from groundwater, but the health of our watersheds and stream flow are dependent on groundwater, too. While groundwater is ubiquitous in Virginia, it is not unlimited. There are already problems with the availability, quality and sustainability of groundwater in Virginia in places such as Fauquier County, Loudoun County and the Coastal Plain.


Wednesday, August 30, 2023

EV Batteries and PFAS

Yoo, Dong‐Joo, Liu, Qian, Cohen, Orion, Kim, Minkyu, Persson, Kristin A., and Zhang, Zhengcheng. Rational Design of Fluorinated Electrolytes for Low Temperature Lithium‐Ion Batteries. Germany: N. p., 2023. Web. doi:10.1002/aenm.202204182.

Lithium-ion batteries are used widely in portable electronics because of their long operation time, life span, and relatively simple manufacturing process. They are rechargeable, lightweight, and capable of higher energy density than most other available battery types, and they operate best at moderate temperatures. Lithium-ion batteries have also been adopted for electric vehicle use.

They are smaller than the batteries used to start gas-powered vehicles’ internal combustion engines. Of course, as well as starting the car, batteries in electric vehicles keep it moving, and they run the vehicle’s other systems like air conditioning, entertainment, and driver assistance systems.

When lithium-ion batteries are exposed to cold temperatures, their storage capacity (how much energy they can store between charges) drops to approximately 77% at around −5 °F. As the temperature falls, the storage capacity continues to fall. This happens because the ethylene carbonate used in the electrolyte solidifies at about −4 °F.

Now, however, with the spread of electric vehicles, the performance of lithium-ion batteries at low temperatures has become an issue because performance varies so much across regions and seasons. To have an entire local transportation fleet knocked out during a polar vortex could be disastrous.

Low temperature performance is one of the most challenging aspects of lithium-ion batteries and EV adoption itself. The lithium-ion batteries used in most battery electric vehicles suffer reduced charging efficiency, significant capacity loss, and accelerated aging in low temperatures. This has a negative effect on electric vehicles’ driving range in cold climates and winter.

The batteries in the electric vehicle also power all the other systems.  Heating the cabin area of the vehicle requires significant amounts of power in cold climates (as does cooling the cabin in hot climates). Testing has shown that, because of this, electric vehicles’ range is reduced to approximately 45% when the external temperature is 5 °F or lower and recharging the batteries is slower.

The authors of the study cited above searched for a solvent that would stay liquid at low temperatures yet still form the crucial solid electrolyte interphase (SEI) barrier over the anode. They found that replacing a terminal methyl group of the ethyl acetate with a trifluoromethyl group produced the desired effect.

In laboratory tests the fluorinated ethyl acetate was found to be as stable in its energy storage capacity over 400 recharging cycles at 5 °F as a battery containing ordinary ethyl acetate was at room temperature. While this solves the problem of EV batteries in winter weather, it potentially adds another PFAS (a forever chemical) to widespread use. Are we destroying ourselves and the planet to reduce greenhouse gases? The researchers have applied for a patent. You can read the research at the link above.

Sunday, August 27, 2023

SCC challenges Dominion’s IRP Assumptions

In May, when Dominion Energy filed its 2023 Integrated Resource Plan (IRP) with the State Corporation Commission (SCC), it essentially showed that Virginia's plans to decarbonize the grid under the VCEA had collided with the exploding demand from the unconstrained growth of the data centers in Northern Virginia. The IRP is meant to guide the SCC's decisions about Dominion’s generation fleet: building new generation and shutting down old generation.

Just to refresh your memory, the 2020 VCEA is the state law outlining a path to decarbonize the electric grid by 2050. The VCEA requires the Commonwealth to retire its natural gas power plants by 2045 (Dominion) and 2050 (Appalachian Power). These facilities currently comprise 67% of baseload in-state generation as well as 100% of the power plants that meet peak demand. About 30% of Virginia’s generation is from nuclear. When the VCEA was crafted, its authors did not foresee the explosive demand for electricity that unconstrained data center development would drive.

The IRP plan presented would increase Dominion’s carbon emissions from current levels, instead of dropping them to zero by 2040 as required under the VCEA. In the IRP submitted to the SCC, Dominion forecasted that power demand would rise 80% and that peak load would rise from a bit more than 17,000 megawatts now to 27,000 megawatts by 2037. You cannot plan for that amount of electricity demand growth over 10 years while eliminating generation capacity. It has never been done, and Dominion admits that it needs to not only keep all of its fossil fuel power generation operating, but is asking to build more dispatchable fossil fuel generation to meet this forecast demand.

Their plan to do that requires over 4,500 MW of incremental energy storage and more than 3,000 MW of incremental small modular reactors (SMRs), a technology still in the demonstration stage, where costs on the first prototype plant have risen more than 50%. Even with these additional resources, Dominion would have to purchase 10,800 MW of additional capacity from PJM in 2045 and beyond, raising significant concerns about system reliability and energy independence, including over-reliance on out-of-state capacity to meet customer needs. This plan will also require a substantial increase in energy purchase limits from both PJM and the SCC.

Now, the SCC is examining and challenging the growth assumptions that went into that forecast. As reported in the Richmond Times-Dispatch, the SCC hired a consultant, Bernadette Johnson, to examine the growth forecast. “She said the increase Dominion forecasts is larger than the actual growth her firm has measured in Texas' self-contained electric grid, where increases have been driven by data center expansion, cryptocurrency operations and faster overall job growth than Virginia sees.”

In addition, the Times-Dispatch reports that “environmental groups and a clean energy trade association told the SCC that Dominion's electricity demand forecast is based on an unrealistic view.” So, it remains to be determined whose forecast is correct. However, the New York Times reported Nvidia's sales soaring, jumping 101% year over year to $13.5 billion. “Nvidia’s roaring sales contrasted sharply with the fortunes of some of its chip industry peers, which have been hurt by soft demand for personal computers and data center servers used for general-purpose tasks…

Some analysts believe that spending on A.I.-specific hardware, such as Nvidia’s chips and systems that use them, is drawing money away from spending on other data center infrastructure. IDC, a market research firm, estimates that cloud services will increase their spending on server systems for A.I. by 68% over the next five years.”

I would like to point out that cryptocurrency mining is very different from the deployment of A.I. and hyperscale data centers. In a demand response program, cryptocurrency miners simply shut down and get paid for stepping off the grid; at the right price they are happy to oblige the grid operators. The Texas grid is larger, and its data centers represent a smaller proportion of the overall load. Data centers cannot step off the grid as easily; they must keep operating and require a viable backup. We are more and more dependent on the internet. If the day comes when A.I. is operating much of our infrastructure control, automobiles, etc., there will be no ability for the data centers to shut down in a demand response program.

A recent Harvard Business Review article by Ajay Kumar and Tom Davenport stated that: “The data center industry… is responsible for 2–3% of global greenhouse gas (GHG) emissions. The volume of data across the world doubles in size every two years. The data center servers that store this ever-expanding sea of information require huge amounts of energy and water (directly for cooling, and indirectly for generating non-renewable electricity) to operate computer servers, equipment, and cooling systems. These systems account for around … 2.8% of the United States’ electricity use.

AI models are generated by “hyperscale” (very large) cloud providers with thousands of servers that produce major carbon footprints; in particular, these models run on graphics processing unit (GPU) chips. These require 10–15 times the energy a traditional CPU needs because a GPU uses more transistors in the arithmetic logic units. Currently, the three main hyperscale cloud providers are Amazon AWS, Google Cloud, and Microsoft Azure.”

What has happened decade after decade in the tech industry is that growth surprises, demand soars, backlogs build, companies double- or triple-order to ensure their growth, and then demand slows and sometimes crashes. In data centers, the demand may be not yet at the peak, past the peak, or about to take off for A.I. specialty data centers. My crystal ball is cloudy, but we do know that Virginia has given up control and management of the situation through the unrestrained approval of data center construction and the requirement that power be delivered on request. Which future will be ours: empty shells decaying along the road, or a county crisscrossed by power lines energized by fossil fuels?

Wednesday, August 23, 2023

Fukushima Begins Releasing the Stored Water

 

water tanks at TEPCO Fukushima Plant

The Tokyo Electric Power Company (TEPCO) and the Japanese government announced on Tuesday that the operation to release the filtered and stored groundwater at the Fukushima nuclear plant would begin on Thursday, and it did. (Thursday had already arrived in Japan.)

On Tuesday TEPCO staged a successful test, taking a sample of about 1 cubic meter of treated water and diluting it with about 1,200 cubic meters of seawater. The treated and diluted water was then tested to verify the treated water had been diluted as expected; the tritium concentration was measured to confirm that it was less than 1,500 becquerels per liter.
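The arithmetic of that test is simple. The sketch below uses an assumed source tritium concentration for illustration only; the post does not give the measured value:

```python
# A quick dilution check (my arithmetic sketch, not TEPCO's procedure). The source
# tritium concentration below is an assumed illustrative value, not a measured one.
treated_m3 = 1.0              # treated water in the test batch (m^3)
seawater_m3 = 1200.0          # seawater used for dilution (m^3)
source_bq_per_l = 500_000.0   # assumed tritium level in treated water (Bq/L)

diluted = source_bq_per_l * treated_m3 / (treated_m3 + seawater_m3)
print(f"diluted tritium: {diluted:.0f} Bq/L (target < 1,500 Bq/L)")
# ~416 Bq/L -- a roughly 1:1200 dilution brings even a few-hundred-thousand Bq/L
# source comfortably under the 1,500 Bq/L operational limit.
```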

Now TEPCO will begin diluting large amounts of treated water from the storage tanks and releasing the diluted treated water into the ocean. The Japanese National Federation of Fisheries Cooperative Associations has continued to oppose the water release plan, concerned about the impact on the reputation of seafood from Fukushima and nearby areas. It was reported that, despite all the preparation and investigation, China issued an import ban on Japanese seafood, with partial bans issued by Hong Kong and Macau.

On March 11, 2011, a magnitude 9.1 earthquake struck off the northeast coast of Honshu, Japan, generating a deadly tsunami. Systems at the Fukushima nuclear plant detected the earthquake and automatically shut down the nuclear reactors. Emergency diesel generators automatically turned on to keep coolant pumping around the nuclear cores to try and keep them cool.

But soon after, the tsunami wave, which was over 46 feet high, hit Fukushima. The water overwhelmed the defensive sea wall, flooding the plant and knocking out the emergency generators. Workers rushed to restore power, but in the days that followed the nuclear fuel in three of the reactors overheated and suffered a nuclear meltdown, in which the nuclear cores partly melted.

The Fukushima nuclear disaster released radioactive materials into the environment and forced thousands of people to evacuate their homes. Ever since 2011, crews have continuously pumped water through the destroyed reactors to keep the nuclear cores cool. In addition, water flows naturally from the mountains toward the sea.

Approximately 150 tons of groundwater, which naturally runs from the mountain side to the ocean, flows into the reactor buildings, cools the reactor cores, and becomes newly contaminated water. Various countermeasures are taken, including filtration to remove radionuclides and storage with secondary containment measures, to prevent the contaminated water from flowing out to the port or leaking from the storage tanks.

TEPCO, which owns the nuclear plant, has been pumping, filtering and storing the water in tanks at the plant. Now, they say that they are running out of space to store the water on land. Last summer TEPCO obtained the approval of the International Atomic Energy Agency (IAEA) for a plan to begin releasing the stored water into the Pacific Ocean, with the release to start sometime this year.

IAEA Director General Grossi accepted Japan’s invitation and appointed a Task Force of independent experts and IAEA staff to carry out the three-pronged review – regulatory, technical and independent sampling and analysis – against international safety standards. These safety standards reflect an international consensus and serve as a global reference for protecting people and the environment from the harmful effects of ionizing radiation. In January the IAEA Task Force completed their second regulatory reviews in Japan.

No one is taking this lightly. The TEPCO crews have continued to pump groundwater through the wrecked reactors to constantly cool the melted nuclear fuel. This cooling water picks up radiation in the form of radionuclides. The water then passes through a specialty filtering process to remove and capture much of the radiation, but the process does not effectively capture tritium, because tritium forms water molecules and no filtration process is perfect. Tritium is a hydrogen atom that has two neutrons and one proton in its nucleus. Though produced naturally in the upper atmosphere, tritium is also produced as a byproduct in nuclear reactors and nuclear explosions.

TEPCO will gradually release up to 22 trillion becquerels of tritium per year from the Fukushima Nuclear Power Station over the next 20 or 30 years. The level of tritium in the water that will be released is below the maximum amount of tritium in drinking water recommended by the World Health Organization (10,000 becquerels per liter). Tritium has a 12-year half-life and gives off only low-energy beta particles that are believed to pose limited risks for marine life and humans. However, there are limits to the ability of the ocean to sustainably dilute the residual contamination. Tritium levels will be monitored and reported on the TEPCO website.

Sunday, August 20, 2023

The Eroding Financial Strength of the US

The United States has set a goal to reach 100% carbon pollution-free electricity by 2035 and net-zero emissions throughout the economy by 2050. The President also pledged an interim goal of a 50-52% reduction from 2005 levels in economy-wide net greenhouse gas pollution by 2030. These goals are pipe dreams, as many of the promises made under the Paris Agreement seem to be. The U.S. Energy Information Administration (EIA) is forecasting that by 2030 energy-related CO2 emissions will fall only 25% to 38% below 2005 levels.

The window for limiting global temperature rise to 1.5 degrees Celsius has probably closed. CO2 emissions from fuel have continued to grow year after year, with the exceptions of brief respites during the global financial crisis and the Covid-19 lockdowns. World CO2 emissions have resumed their climb. Total CO2 emissions for planet earth reached 40.6 billion tonnes of CO2 (GtCO2) in 2022.

“The scientific community has made clear that the scale and speed of necessary action is greater than previously believed.  There is little time left to avoid setting the world on a dangerous, potentially catastrophic, climate trajectory.  Responding to the climate crisis will require both significant short-term global reductions in greenhouse gas emissions and net-zero global emissions by mid-century or before.”

“It is the policy of my Administration to organize and deploy the full capacity of its agencies to combat the climate crisis to implement a Government-wide approach that reduces climate pollution in every sector of the economy; increases resilience to the impacts of climate change; protects public health; conserves our lands, waters, and biodiversity; delivers environmental justice; and spurs well-paying union jobs and economic growth, especially through innovation, commercialization, and deployment of clean energy technologies and infrastructure…” 

So, our nation is on a mission to reduce its greenhouse gas emissions and to prepare for and respond to climate change. Climate change is also increasing the frequency and supercharging the intensity of drought, lengthening wildfire seasons in the Western states, and making extremely heavy rainfall more common in the eastern states. Sea level rise is worsening hurricane storm surge flooding. In 2022, the U.S. experienced 18 separate weather and climate disasters costing at least $1 billion each, with the total reaching $165 billion.

from NOAA

According to NOAA, the number and cost of weather and climate disasters are increasing in the United States due to a combination of increased population and material wealth over the last several decades, exacerbated by the fact that much of the growth has taken place in vulnerable areas like coasts, the wildland-urban interface, and river floodplains.

The problem is that as a nation we are spending our way into poverty and reducing our ability to afford to respond to disasters and protect our citizens. Since the global financial crisis in 2008, when our national debt stood at $13.6 trillion, it has ballooned to $32.7 trillion today. In 2007, before the global financial crisis, our annual budget deficit was $0.16 trillion; the following year it ballooned to $0.45 trillion, and it simply never returned to pre-2008 levels. In 2020 the budget deficit was $3.13 trillion; currently it is $1.61 trillion. Our debt now exceeds our GDP.

from Treasury.gov




from Treasury.gov

from Treasury.gov

Wednesday, August 16, 2023

Carbon Offsets and Regulators

Carbon credits or offsets and their related markets provide tools for organizations seeking to reduce their carbon footprint and comply with both voluntary and mandatory emissions reduction goals. Carbon credits or offsets represent carbon emission reductions or removals and are traded on various exchanges or markets.

Carbon markets allow carbon emitters to purchase credits that are awarded to projects that remove or reduce atmospheric carbon.  These credits offset their emissions to reach their voluntary commitment to reduce “net” emissions. Each carbon credit typically corresponds to one metric ton (tonne) of reduced, avoided or removed carbon dioxide or equivalent greenhouse gas. 

Offsets are a popular tool that companies use to reduce their net greenhouse gas (GHG) emissions and live up to their environmental, social and governance (ESG) goals, as well as promises made to customers and consumers. By purchasing carbon offsets, businesses believe they are financing renewable energy projects that remove GHG emissions from the atmosphere or avoid GHG emissions – such as commitments to preserve forests or the construction of facilities to capture carbon emissions – without being involved directly in these projects.

These voluntary markets can be distinguished from “compliance” carbon markets, which is the term for systems where a government or regulator issues a carbon allowance that participants must not exceed unless they can purchase additional compliance allowances from another participant under the cap-and-trade program.

The Inflation Reduction Act (IRA) is a far-reaching law that includes provisions to “finance green power, lower costs through tax credits, reduce emissions, and advance environmental justice.” The IRA is intended to reduce U.S. carbon emissions by roughly 40% by 2030 and to reach a net-zero economy by 2050. The passage of the IRA has inspired greater regulatory scrutiny of the carbon credit markets to avoid greenwashing.

The Commodity Futures Trading Commission (CFTC), the Securities and Exchange Commission (SEC), and the Federal Trade Commission (FTC) have all proposed updated rules to address deceptive claims about the use of carbon offsets. The CFTC is exerting jurisdiction over fraud and manipulation in "physical" carbon markets, and the FTC has proposed updates to its Green Guides to address deceptive claims about the use of carbon offsets.

The CFTC recently created an Environmental Fraud Task Force to examine fraud and other misconduct in regulated and voluntary carbon markets. In particular, the CFTC is interested in:

  • manipulative and wash trading or other violations in carbon market futures contracts
  • fraud in markets related to ghost or illusory carbon offsets listed on carbon market registries
  • double counting or other fraud related to carbon offsets when the same offset is claimed by more than one entity without an additional carbon benefit
  • fraudulent statements relating to material terms of the carbon offset
  • manipulation of tokenized carbon markets
  • fraudulent claims that offsets are in addition to any reductions that would have occurred in a business-as-usual scenario or as required by law

The FTC will use its broad statutory authority over unfair and deceptive practices with respect to environmental claims. The FTC is finalizing standards for the “Use of Environmental Marketing Claims” to provide clarity and stricter guidance for claims made by using carbon offsets that products or businesses are carbon-neutral, have net‑zero emissions, or are low‑carbon or carbon-negative.

There is currently no legal requirement that companies verify the quality of offsets used to make climate claims, although many voluntarily verify them through organizations such as the Integrity Council for the Voluntary Carbon Market (ICVCM).

Sunday, August 13, 2023

The Cold Tongue, El Niño, and Climate Change

Two years ago, the U.N. Intergovernmental Panel on Climate Change (IPCC) released its latest report on climate change, in which the IPCC greatly narrowed the likely future temperature rise. Nonetheless, emissions of carbon dioxide have been rising by about 1% per year on average for the past decade (with a slight pullback during the pandemic). Though renewable energy use has been expanding rapidly, much of the renewable energy is being deployed alongside existing fossil energy, not replacing it.

All the climate models tie the rise in global temperatures to concentrations of atmospheric carbon dioxide, and this is still happening. The planet has warmed 1.1 degrees C since the late 19th century and is expected to warm an additional 0.4 degrees C in the next 20 years. Just to make this point, this past July was reportedly the warmest month in recorded history.

Mankind, by burning fossil fuels and covering the earth with concrete, is responsible for this rise in temperature. No actions that nations are likely to take can change this trajectory; we only have some hope of moderating it. That was the key finding of the IPCC report. Total CO2 emissions for planet earth have passed 40.6 billion tonnes of CO2 (GtCO2) per year.

The Pacific, the largest ocean on earth, has a surface larger than all the continents combined. The weather on the planet is affected by the El Niño Southern Oscillation, in which the winds across the Pacific Ocean move the planet between La Niña and El Niño conditions every few years, and by the Pacific Decadal Oscillation, a much longer pattern that appears to play out over 20-30 years.

As pointed out in an excellent article by Madeleine Cuff last week in New Scientist, the massive climate models created to model our planet predict that, as a result of climate change, the surface of the Pacific Ocean should be warming; but hidden in the large natural variability of the Pacific Ocean is an oddity. Between 1980 and 2022 the planet’s sea surface temperatures increased faster than the Earth’s surface temperature. However, there is an area in the eastern Pacific Ocean, emanating from the coast of South America, that has been cooling, defying all the models. That area is known as the cold tongue.

While the eastern Pacific Ocean has always been (as far as we know) cooler than the western Pacific Ocean, this difference has increased by about 10% (about half a degree Celsius). This oddity may have a significant impact on how quickly the planet warms and on the resultant weather patterns. If the eastern Pacific Ocean were to suddenly flip to a warming pattern, this could change the base state of the climate to La Niña conditions. Climate resilience plans would need to change to respond to unanticipated extensive and permanent drought in the U.S. Southwest and the Horn of Africa.

Scientists do not understand the cause of the cold tongue or how long it may continue. It matters for understanding future planetary climate conditions and how severe they will be. If it continues, the cold tongue could reduce the anticipated global warming by about 30%, according to some researchers. Many efforts have been made to reconcile the discrepancies between climate model projections and real-world observations (Solomon and Newman 2012; L'Heureux et al. 2013a; Luo et al. 2018; Chung et al. 2019) with various theoretical arguments.

Researchers at Lamont-Doherty Earth Observatory at Columbia University have been studying the Pacific Oscillation and climate modeling. Their work indicates that not all climate models include Antarctic meltwater in their calculations and some have trouble correctly reflecting changes to sea temperatures, winds and currents in the Southern Ocean. As a result, warming projections for this century by current global climate models may be overestimated. That would be good news.

Other scientists believe that the current climate models will eventually be proven correct: the eastern Pacific will flip to warming because of all the greenhouse gasses mankind continues to pump into the atmosphere. Now, to try to solve the mystery, an international working group formed this year to study the cold tongue. To predict what will happen next, we first need to understand what is happening now.

Read the research from Lamont-Doherty Earth Observatory, the linked studies above and the full article in New Scientist if you are interested in learning more.

Wednesday, August 9, 2023

Virginia SCC approves a Demand Response Program for Dominion

The Virginia State Corporation Commission (SCC) has approved Dominion Energy's request for five new "demand-side" management programs, intended as elements of the utility's efforts to maintain the stability of the grid as it decarbonizes and to reduce costs for customers.

from SCC: Dominion Energy VA typical bill

According to a report in the appendix of the SCC hearing, these new programs expand on an existing pilot program, called an off-peak plan, that was originally approved in 2021. It used a "Time Of Use" rate that charged the 10,000 participating customers more for electricity during peak hours while saving them money during times when people don't traditionally use energy. That program saved customers on average $17 a year. In addition, customers used 9.4% less energy during peak times in the summer and 2.9% less during the winter.

However, the report found that what were termed "high-baseline" customers (those who must reduce energy use during peak times to save money) saw their bills increase, while "structural winners" (those who simply benefit from the rate changes without having to alter their habits) experienced all the savings. As the Attorney General's Office pointed out in a letter to the SCC, that may prove to be an unsustainable design, encouraging only structural winners to participate while the customers with the greatest potential for load shifting abstain to avoid higher costs. With growing electric bills, though, people could also learn to adjust their power usage to save money.
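To illustrate how a Time Of Use rate splits customers into winners and losers, here is a minimal sketch; the rates and kWh figures are hypothetical, not Dominion's actual tariff:

# Hypothetical Time Of Use (TOU) bill vs. a flat rate.
flat_rate = 0.12                       # $/kWh, assumed flat tariff
peak_rate, off_peak_rate = 0.24, 0.08  # $/kWh, assumed TOU tariff

def monthly_bills(peak_kwh, off_peak_kwh):
    """Return (flat bill, TOU bill) in dollars for one month's usage."""
    flat = flat_rate * (peak_kwh + off_peak_kwh)
    tou = peak_rate * peak_kwh + off_peak_rate * off_peak_kwh
    return flat, tou

# A load shifter who moves usage off-peak comes out ahead under TOU...
print(monthly_bills(peak_kwh=150, off_peak_kwh=850))   # ($120, $104)
# ...while a "high-baseline" customer stuck using power at peak pays more.
print(monthly_bills(peak_kwh=600, off_peak_kwh=400))   # ($120, $176)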

According to Dominion Energy's web site, the company will expand the program to reward participating business customers for reducing their electricity use during times when the power grid is experiencing heavier than normal use:

"Our Non-Residential Distributed Generation Program partners with interested customers to switch their power source from the Dominion grid to a backup generator for a limited number of hours each year. In return, the customer receives a monthly incentive payment based on their reduced power consumption."

The program will provide participating customers a monthly incentive to allow their on-site backup generators to be remotely activated by PowerSecure during load curtailment events (a "control event") for up to a total of 120 hours per year. The monthly incentive is based on the amount of load curtailed, the amount of fuel consumed during load curtailment events, and/or tests requested by Dominion Energy Virginia during the month.

Monthly Participation Payment = Load Curtailment Capability Payment + Fuel Payment + Variable Operations & Maintenance Adder
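Expressed as a small calculation, the formula looks like the sketch below. The component names come from the formula above, but the rate values are hypothetical placeholders; Dominion and PowerSecure's actual program rates are not spelled out here.

# Monthly Participation Payment per the formula above (placeholder rates).
def monthly_participation_payment(curtailable_kw,   # load the generator can pick up
                                  capability_rate,  # $/kW, assumed payment rate
                                  fuel_gallons,     # fuel burned in control events
                                  fuel_rate,        # $/gallon, assumed reimbursement
                                  curtailed_kwh,    # energy shifted off the grid
                                  vom_adder):       # $/kWh variable O&M adder, assumed
    capability_payment = curtailable_kw * capability_rate
    fuel_payment = fuel_gallons * fuel_rate
    vom_payment = curtailed_kwh * vom_adder
    return capability_payment + fuel_payment + vom_payment

# Example: 500 kW curtailable, 200 gallons burned, 4,000 kWh curtailed.
print(monthly_participation_payment(500, 5.00, 200, 4.50, 4000, 0.02))  # 3480.0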

It seems that slowly but surely Dominion is moving toward having large flat-demand industrial users (data centers) become dispatchable microgrids to protect the main PJM grid. Microgrids are self-contained electrical networks that draw from on-site energy sources. These supplement grid power to keep the data center online in the case of a shortfall of power, preventing a grid outage. Diesel backup generators have been the norm in Virginia. However, diesel is the dirtiest source of power for a microgrid, and frankly, after this summer of awful air, I want better. As the backup power for the data centers becomes the emergency dispatchable power for our region, we must have cleaner power. We can no longer allow diesel generators to be the backup power for data centers. This dispatchable power should be natural gas, fuel cells, or other generation combined with on-site energy storage.

Sunday, August 6, 2023

Digital Gateway Rezoning is Coming

On November 1, 2022, the Prince William Board of County Supervisors adopted the Comprehensive Plan Amendment for the PW Digital Gateway. They did this without performing a watershed study as requested by Fairfax County and Fairfax Water. The Digital Gateway development could very likely endanger the Occoquan watershed, the most urbanized watershed in the nation and one currently experiencing degradation, and the Occoquan Reservoir, the source of water for 800,000 Prince William and Fairfax residents. The data center companies, the landowners, and seemingly the Democratic majority of our Prince William County Supervisors could not care less. To move forward, the proposed data centers have to be rezoned and obtain a Special Use Permit (SUP).

Now, three rezoning applications have been submitted within the area of the PW Digital Gateway Comprehensive Plan Amendment: Digital Gateway North, Digital Gateway South, and Compass Datacenters. The proposals also request a waiver of the requirement that the proposed data centers have an approved SUP. The rezoning requests lack a detailed layout of the site, despite detailed requests for additional information from PWC staff. The proposed rezoning requests are too general and do not provide sufficient detail to even determine the location of site features and Resource Protection Areas (RPAs) under the Chesapeake Bay Preservation Act. It appears that the data centers propose using the RPA as the path for the power lines. That is forbidden by Virginia law, the Chesapeake Bay Preservation Act.

To waive the SUP requirement, PWC staff would need the same level of detail in the rezoning request as would be required with the SUP, with all relevant impacts appropriately mitigated to protect our water, our grid and our citizens, just as a SUP would. Yet the data center developers think that on such a large and complicated site, asking for this level of detail is unreasonable. They failed to submit the requested information, address the impacts to the properties to the west, or develop adequate mitigations for impacts to the Battlefield and historical resources.

Data centers are the physical factories of the internet. Standard data centers are warehouses filled with row upon row of servers, routers, wires, and other information technology hardware, spanning hundreds of thousands of highly cooled square feet per building and sucking up incredible amounts of power. Now we have the emerging demand for AI data centers. These specialized data centers run high performance chips, such as the Nvidia graphics processing units, that use seven times the power of traditional data centers. This requires additional power infrastructure, and the extra power generates more heat, requiring liquid cooling to prevent the equipment from overheating. The applicants have not submitted the maximum daily water demands and peak wastewater flows for each phase of development, so the hydraulic capacity studies by the PW Service Authority cannot be completed.

The intention to migrate these 2.5 million square feet of data centers to AI may be revealed in the proposed water usage. The comment from PWC staff: "The CPA policies encourage efficient water usage for data center development using closed loop water, or no water-cooling systems. What is proposed is not consistent with the CPA policies." Furthermore, the Applicants propose increasing the height limit to 100 feet for the buildings while not counting the roof-mounted cooling equipment in that height calculation. Yet the viewshed analysis was done using a lower height, and only for points in the Battlefield, not the western property boundary.

Though the rezoning request provides numerous notes and labels throughout, there are no assurances that the proposed improvements, standards, and locations of these items will be complied with. The Applicant describes items as "typical" rather than specific to the plan. In addition, substation locations shown are "approximate." Limits of development shown are "approximate." Wildlife crossings shown are "approximate." Basically, the data center Applicants are saying trust us. It's a blank check, and they can do anything they want, endangering our environment, our water supply and our power supply.

Last spring Dominion Energy released its Integrated Resource Plan (IRP), which details how it plans to meet the electricity needs and demands of its customers over the next 15 years. The picture it painted is that Dominion cannot both meet the power demand of the exploding number of data centers in Virginia and satisfy the mandates of the Virginia Clean Economy Act (VCEA). The 2020 VCEA is the state's law outlining a path to decarbonize the electric grid by 2050. With the specter of seven times the power usage from AI data centers, Dominion may not even be able to meet the power demand.

Dominion Energy Virginia does not produce all the electricity it delivers and sells. Dominion Energy is a member of PJM Interconnection, LLC (PJM), the regional transmission organization coordinating the wholesale electric grid in the Mid-Atlantic region of the United States. Other PJM members supply a significant amount of the energy used in Virginia, about a fifth. PJM recently identified increasing reliability risks from both the growing demand for power in Virginia (and the profile of that demand) and the premature retirement of dispatchable carbon generation facilities across the region.

According to PJM, data center load in Northern Virginia will grow at 7% annually. At that rate, electricity demand from traditional data centers alone will roughly double over the next 10 years; for AI data centers it will be multiples of that.
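That doubling is simple compound growth; here is a quick check using PJM's 7% figure:

# Compound load growth at 7% per year.
growth_rate = 1.07
for years in (10, 20):
    print(f"{years} years: {growth_rate ** years:.2f}x today's load")
# 10 years: 1.97x (roughly double); 20 years: 3.87x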

In the rezoning request for the Digital Gateway, the Applicants treated the specific details requested by staff, details meant to ensure that the plans comply with federal and state laws, regulations and rules, as too much to ask on such a large-scale development. They ignored them. These rules and regulations exist for a purpose: to protect our water supply, our grid, our environment and us. The details are necessary. After all, the devil's in the details.

Wednesday, August 2, 2023

The PJM

I know that I have referred to the PJM fairly often lately, but it's been quite a while since I covered the basics of power, generation and the PJM. The electricity that powers our lives (charges our phones, powers the internet, equipment, lights, homes, offices, air conditioning and soon everything else) is there when we need it because of the power grid, an interconnected system that keeps electricity flowing to our homes and businesses every moment of every day. PJM Interconnection, a regional transmission grid operator, works behind the scenes to ensure the reliability of the power grid and to keep the lights on.

PJM began in 1927 when three utilities formed the world's first continuing power pool to gain the benefit of combining their resources. These utilities connected to each other through high voltage transmission lines, creating an interconnection of resources. Interconnection is a two-way street, allowing those who are connected to the grid to take or give power, sharing resources back and forth as needed. The PJM grid once covered Pennsylvania, New Jersey and Maryland (the source of the initials PJM); but today PJM extends into 13 states and the District of Columbia: Delaware, Illinois, Indiana, Kentucky, Maryland, Michigan, New Jersey, North Carolina, Ohio, Pennsylvania, Tennessee, Virginia, and West Virginia.

PJM is a regional transmission organization (RTO) and an independent system operator (ISO) that takes responsibility for grid operations, reliability, and transmission service within its defined geographical region. The North American transmission grid includes all of the United States and most of Canada, and it contains the nine major RTOs/ISOs. To ensure that these organizations operate efficiently and reliably, the United States government empowers the North American Electric Reliability Corporation (NERC) to establish and enforce performance standards for all of them, except for the Texas grid, which the state of Texas regulates separately.

The large size of PJM's market area increases the diversity of resources available to meet consumer needs, providing benefits to both market participants and consumers and enhancing the reliability of the grid. Power pooling gives PJM members the benefit of drawing from electricity resources across a broad geographic area, with diverse weather (not everyone is hit with the same storms) and generation sources. This means that if one area is short on resources, resources can be brought in from a different area to ensure grid reliability.

While some large-scale battery storage options are now available, many are reserved for emergency situations; the vast majority of electricity must be used when it is generated. For this reason, PJM must have generators up and running to supply electricity when outdoor temperatures soar and customers "demand" more power to run air conditioners, or when utilities restore power to a large number of customers after a major storm. Hence the charts I have occasionally put up of PJM's forecast of power needs for the next 24-48 hours. PJM forecasts the demand and makes sure the power generation is available to meet that need.
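Conceptually, the adequacy check behind those forecasts boils down to something like the sketch below. This is entirely my simplification, with made-up numbers, not PJM's actual dispatch software or reserve requirement:

# Toy day-ahead adequacy check: is committed generation enough to cover
# each hour's forecast demand plus a reserve margin?
forecast_mw = [95_000, 102_000, 110_000, 118_000]  # hypothetical hourly loads
committed_mw = 125_000                             # hypothetical available generation
reserve_fraction = 0.15                            # assumed reserve requirement

for hour, load in enumerate(forecast_mw):
    required = load * (1 + reserve_fraction)
    status = "OK" if committed_mw >= required else "SHORTFALL: commit more units"
    print(f"hour {hour}: need {required:,.0f} MW, have {committed_mw:,} MW -> {status}")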



Sunday, July 30, 2023

PFAS in 20% of Private Wells

The following summarizes and quotes the findings and results from the USGS study.

from USGS

On March 14, 2023, EPA announced the proposed National Primary Drinking Water Regulation (NPDWR) for six Per- and Polyfluoroalkyl Substances (PFAS) including perfluorooctanoic acid (PFOA), perfluorooctane sulfonic acid (PFOS), perfluorononanoic acid (PFNA), hexafluoropropylene oxide dimer acid (HFPO-DA, commonly known as GenX Chemicals), perfluorohexane sulfonic acid (PFHxS), and perfluorobutane sulfonic acid (PFBS). EPA anticipates finalizing the regulation by the end of 2023.

PFAS do not occur in nature; they are entirely synthetic substances. Yet most people in the United States have been exposed to PFAS and have PFAS in their blood, especially perfluorooctane sulfonic acid (PFOS) and perfluorooctanoic acid (PFOA). There are thousands of PFAS chemicals, and they are found in many different consumer, commercial, and industrial products. This category of chemicals has been widely used for over 80 years, mainly for their ability to repel oil, grease, water, and heat.

We have all been exposed to PFAS in everyday life: stain-resistant carpeting, nonstick cookware, grease- and water-proof food packaging, fabric softeners, waterproof clothing, cosmetics, and our diet and water. These forever chemicals are washed out of our clothing, carpeting, pans and skin, and end up in our wastewater. There are numerous sources of exposure, including industrial emissions, PFAS-containing consumer products, contaminated drinking and surface water, house dust and food.

Though very water soluble, PFAS are resistant to degradation and simply flow through the wastewater treatment plant or septic leach field; PFAS remain in the biosolids and effluent. That is how they have spread throughout society and into our food supply. When the U.S. EPA was developing the regulations, it utilized the Unregulated Contaminant Monitoring Rule (UCMR) program to collect data for contaminants suspected to be present in drinking water but that do not yet have health-based standards set under the Safe Drinking Water Act (SDWA).

EPA had public water systems serving more than 10,000 people gather data on a handful of PFAS. The EPA found that 4% of the large US drinking-water treatment plants tested had detectable PFAS. However, this probably vastly underestimated the extent of contamination because of the high detection limits (10-90 ng/L, depending on the individual PFAS) used in that testing and the limited number of PFAS tested for (Hu et al., 2016).

That national testing program, the UCMR3, focused only on community water supplies serving 10,000 or more consumers and did not include private wells or information from rural communities (52 million people rely on small water supplies serving fewer than 10,000). Data on PFAS exposure and potential human-health effects does not exist for roughly a third of the US population: the 40 million people on private wells and the 52 million who get their water from small community water systems.

Limited information is available on PFAS concentrations in point-of-use tap water. Most of the drinking-water studies only looked at samples from the source waters (McMahon et al., 2022; Sims et al., 2022) or pre-distribution samples from community water supplies. The contribution of the distribution system and plumbing materials was largely ignored, and there was a lack of data for private wells across the US. Now the U.S. Geological Survey has completed a multi-year sampling program and created a model to estimate the probable concentrations of 32 PFAS at the point of use for water systems and private wells.

The study tested for 32 individual PFAS compounds using a method developed by the USGS National Water Quality Laboratory. The most frequently detected compounds in this study were PFBS, PFHxS and PFOA. The interim health advisories released by the EPA in 2022 for PFOS and PFOA were exceeded in every sample in which those compounds were detected, because the laboratory detection limits were higher than the advisory levels. So the probability model conservatively estimates the occurrence of those chemicals.
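To see why every detection was an exceedance, compare the numbers. The advisory levels below are EPA's published 2022 interim figures; the reporting limit is an illustrative assumption of mine, representative of typical lab method limits in the low ng/L range:

# EPA 2022 interim health advisories vs. an assumed lab reporting limit.
advisories_ng_per_l = {"PFOA": 0.004, "PFOS": 0.02}  # EPA interim advisories
assumed_reporting_limit = 2.0                        # ng/L, illustrative assumption
for compound, advisory in advisories_ng_per_l.items():
    print(f"{compound}: reporting limit ~{assumed_reporting_limit / advisory:,.0f}x the advisory")
# PFOA: ~500x; PFOS: ~100x, so any detection automatically exceeds the advisory.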

Scientists collected tap water samples from 716 locations representing a range of low, medium and high human-impacted areas. The low category includes protected lands; medium includes residential and rural areas with no known PFAS sources; and high includes urban areas and locations with reported PFAS sources such as industry or waste sites.  

At least one PFAS was detected in 20% of private-well samples (55/269) and 40% of public-supply samples (182/447) collected throughout the US. A similar pattern was reported in groundwater from the eastern US, in which 60% of public-supply wells and 20% of monitoring wells contained at least one PFAS (McMahon et al., 2022). Median cumulative PFAS concentrations (estimated given the detection limits) were comparable between public-supply (median = 7.1 ng/L) and private-well point-of-use tap water (median = 8.2 ng/L).
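The headline percentages follow directly from the sample counts cited above:

# Detection rates from the USGS sample counts.
samples = {"private wells": (55, 269), "public supply": (182, 447)}
for source, (detections, total) in samples.items():
    print(f"{source}: {detections}/{total} = {detections / total:.0%}")
# private wells: 20%; public supply: 41%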

I am one of the approximately 40 million people in the US who rely on private wells for drinking water and who are responsible for maintaining a safe water supply for their families. I plan to test my well as soon as a reliable test below the regulatory limit is commercially available and the labs have gained more experience with the sample-handling protocols. Don't panic. If you feel you must do something now, you could try testing your well yourself. Many laboratories and universities offer water quality testing services to homeowners. Most of these institutions will ship you a sampling kit, and you return the samples to them for analysis. Four that I know of are Cyclopure, Tap Score, Freshwater Future, and WaterCheck. The Virginia Tech water clinics do not yet test for PFAS. Also, the most effective removal system is reverse osmosis, but the disposal of any PFAS removed is problematic at this time.