Wednesday, August 30, 2023

EV Batteries and PFAS

Yoo, Dong‐Joo, Liu, Qian, Cohen, Orion, Kim, Minkyu, Persson, Kristin A., and Zhang, Zhengcheng. Rational Design of Fluorinated Electrolytes for Low Temperature Lithium‐Ion Batteries. Germany: N. p., 2023. Web. doi:10.1002/aenm.202204182.

Lithium-ion batteries are widely used in portable electronics because of their long run time, long life span, and relatively simple manufacturing process. They are rechargeable, lightweight, and capable of higher energy density than most other available battery types, which is why they have also been adopted for electric vehicles. Lithium-ion batteries operate best at moderate temperatures.

Unlike the small batteries used only to start gas-powered vehicles’ internal combustion engines, the batteries in electric vehicles both start the car and keep it moving, and they run the vehicle’s other systems like air conditioning, entertainment, and driver assistance.

When lithium-ion batteries are exposed to cold temperatures, their storage capacity – how much energy they can store between charges – drops to approximately 77% at around −5 °F, and it continues to fall as the temperature drops. This happens because the ethylene carbonate used in the electrolyte solidifies at about −4 °F.

Now, however, with the spread of electric vehicles, the low-temperature performance of lithium-ion batteries has become an issue because performance varies so widely by region and season. To have an entire local transportation fleet knocked out during a polar vortex could be disastrous.

Low-temperature performance is one of the most challenging aspects of lithium-ion batteries and of EV adoption itself. The lithium-ion batteries used in most battery electric vehicles suffer reduced charging efficiency, significant capacity loss, and accelerated aging in the cold. This cuts electric vehicles’ driving range in cold climates and in winter.

The batteries in an electric vehicle also power all of its other systems. Heating the cabin requires significant amounts of power in cold climates (as does cooling it in hot climates). Testing has shown that, because of this, an electric vehicle’s range drops to approximately 45% of normal when the external temperature is 5 °F or lower, and recharging the batteries is slower.

The authors of the study cited above searched for a solvent that would stay liquid at low temperatures yet still form the crucial SEI barrier over the anode. They found that replacing the terminal methyl group of the ethyl acetate with a trifluoromethyl group produced the desired effect.

In laboratory tests, the fluorinated ethyl acetate held its energy storage capacity over 400 recharging cycles at 5 °F as well as a battery containing plain ethyl acetate did at room temperature. While this solves the problem of EV batteries in winter weather, it potentially adds another PFAS (a “forever chemical”) to widespread use. Are we destroying ourselves and the planet to reduce greenhouse gases? The researchers have applied for a patent. You can read the research at the link above.

Sunday, August 27, 2023

SCC challenges Dominion’s IRP Assumptions

In May, when Dominion Energy filed its 2023 Integrated Resource Plan (IRP) with the State Corporation Commission (SCC), it essentially showed that Virginia’s plans to decarbonize the grid under the VCEA had collided with the exploding demand from the unconstrained growth of data centers in Northern Virginia. The IRP is meant to guide the SCC’s decisions about Dominion’s generation fleet – building new generation and shutting down old generation.

Just to refresh your memory, the 2020 VCEA is the state law outlining a path to decarbonize the electric grid by 2050. The VCEA requires the Commonwealth to retire its natural gas power plants by 2045 (Dominion) and 2050 (Appalachian Power). These facilities comprise 67% of the current baseload in-state generation as well as 100% of the power plants that meet peak demand. About 30% of Virginia’s generation is nuclear. When the VCEA was crafted, its authors did not foresee the explosive demand for electricity that unconstrained data center development would drive.

The IRP as presented would increase Dominion’s carbon emissions from current levels, instead of dropping them to zero by 2040 as required under the VCEA. In the IRP submitted to the SCC, Dominion forecast that power demand would rise 80% and that peak load would rise from a bit more than 17,000 megawatts now to 27,000 megawatts by 2037. You cannot plan for that amount of electricity demand growth over 10 years while eliminating generation capacity. It has never been done, and Dominion admits that it needs not only to keep all of its fossil fuel generation operating, but is asking to build more dispatchable fossil fuel generation to meet this forecast demand.
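To put the forecast in perspective, the peak-load figures above imply a sustained compound growth rate that can be checked with a few lines of arithmetic (the 2023 starting year is my assumption; the MW figures are from the IRP as reported):

```python
# Sanity check on Dominion's forecast: peak load rising from roughly
# 17,000 MW today to 27,000 MW by 2037 (figures cited from the IRP).
def implied_annual_growth(start_mw: float, end_mw: float, years: int) -> float:
    """Compound annual growth rate implied by two peak-load figures."""
    return (end_mw / start_mw) ** (1 / years) - 1

cagr = implied_annual_growth(17_000, 27_000, years=2037 - 2023)
print(f"Implied peak-load growth: {cagr:.1%} per year")  # roughly 3.4%/yr
```

Sustaining over 3% annual peak-load growth for 14 years, while simultaneously retiring dispatchable generation, is the planning contradiction at the heart of the IRP.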

That plan requires over 4,500 MW of incremental energy storage and more than 3,000 MW of incremental small modular nuclear reactors (SMRs), a technology still in the demonstration stage, where costs on the first prototype plant have already risen more than 50%. Even with these additional resources, Dominion would have to purchase 10,800 MW of additional capacity from PJM in 2045 and beyond, raising significant concerns about system reliability and energy independence, including over-reliance on out-of-state capacity to meet customer needs. The plan would also require a substantial increase in energy purchase limits from both PJM and the SCC.

Now, the SCC is examining and challenging the growth assumptions that went into that forecast. As reported in the Richmond Times Dispatch, the SCC hired a consultant, Bernadette Johnson, to examine the growth forecast. “She said the increase Dominion forecasts is larger than the actual growth her firm has measured in Texas' self-contained electric grid, where increases have been driven by data center expansion, cryptocurrency operations and faster overall job growth than Virginia sees.”

In addition, the Times Dispatch reports that “environmental groups and a clean energy trade association told the SCC that Dominion's electricity demand forecast is based on an unrealistic view.” So, it remains to be determined whose forecast is correct. However, the New York Times reported Nvidia’s sales soaring, jumping 101% year over year to $13.5 billion: “Nvidia’s roaring sales contrasted sharply with the fortunes of some of its chip industry peers, which have been hurt by soft demand for personal computers and data center servers used for general-purpose tasks…

Some analysts believe that spending on A.I.-specific hardware, such as Nvidia’s chips and systems that use them, is drawing money away from spending on other data center infrastructure. IDC, a market research firm, estimates that cloud services will increase their spending on server systems for A.I. by 68% over the next five years.”

I would like to point out that cryptocurrencies are very different from A.I. deployments and hyperscale data centers. In a demand response program, cryptocurrency miners simply shut down and get paid for stepping off the grid; at the right price they are happy to oblige the grid operators. The Texas grid is also larger, and its data centers represent a smaller proportion of the overall load. Conventional data centers cannot step off the grid as easily; they must keep operating and require a viable backup. We are more and more dependent on the internet. If the day comes when A.I. is operating much of our infrastructure – control systems, automobiles, etc. – there will be no ability for the data centers to shut down in a demand response program.

A recent Harvard Business Review article by Ajay Kumar and Tom Davenport stated that: “The data center industry… is responsible for 2–3% of global greenhouse gas (GHG) emissions. The volume of data across the world doubles in size every two years. The data center servers that store this ever-expanding sea of information require huge amounts of energy and water (directly for cooling, and indirectly for generating non-renewable electricity) to operate computer servers, equipment, and cooling systems. These systems account for around … 2.8% of the United States’ electricity use.

AI models are generated by “hyperscale” (very large) cloud providers with thousands of servers that produce major carbon footprints; in particular, these models run on graphics processing unit (GPU) chips. These require 10–15 times the energy a traditional CPU needs because a GPU uses more transistors in the arithmetic logic units. Currently, the three main hyperscale cloud providers are Amazon AWS, Google Cloud, and Microsoft Azure.”

What has happened decade after decade in the tech industry is that growth surprises, demand soars, backlogs build, companies double- or triple-order to ensure their growth, and then demand slows and sometimes crashes. For data centers, demand may be not yet at its peak, past its peak, or about to take off for A.I.-specialty data centers. My crystal ball is cloudy, but we do know that Virginia has given up control and management of the situation through the unrestrained approval of data center construction and the requirement that power be delivered on request. Which future will be ours – empty shells decaying along the road, or a county crisscrossed by power lines energized by fossil fuels?

Wednesday, August 23, 2023

Fukushima Begins Releasing the Stored Water


water tanks at TEPCO Fukushima Plant

The Tokyo Electric Power Company (TEPCO) and the Japanese government announced on Tuesday that the operation to release the filtered and stored groundwater at the Fukushima nuclear plant would begin on Thursday, and it did. (Thursday had already arrived in Japan.)

TEPCO staged a successful test on Tuesday, taking a sample of about 1 cubic meter of treated water and diluting it with about 1,200 cubic meters of seawater. The treated and diluted water was then tested to verify it had been diluted as expected: the tritium concentration was measured to confirm that it was less than 1,500 becquerels per liter.
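The dilution arithmetic behind that test is straightforward. A minimal sketch, in which the initial tritium concentration is a hypothetical illustrative value (not a measured TEPCO figure):

```python
# Dilution check for the test release: ~1 m^3 of treated water mixed
# into ~1,200 m^3 of seawater. The starting concentration below is a
# HYPOTHETICAL value for illustration, not a measured one.
def diluted_concentration(c0_bq_per_l: float, v_treated_m3: float, v_sea_m3: float) -> float:
    """Concentration after mixing, assuming the seawater adds no tritium."""
    return c0_bq_per_l * v_treated_m3 / (v_treated_m3 + v_sea_m3)

c = diluted_concentration(c0_bq_per_l=1_400_000, v_treated_m3=1, v_sea_m3=1_200)
print(f"{c:.0f} Bq/L")  # well under the 1,500 Bq/L operational limit
```

Even a treated-water concentration in the millions of becquerels per liter falls below the operational limit after roughly 1,200-fold dilution, which is why the seawater mixing step is central to the release plan.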

Now TEPCO will begin diluting large amounts of treated water from the storage tanks and releasing it into the ocean. The Japanese National Federation of Fisheries Cooperative Associations has continued to oppose the release plan, concerned about the impact on the reputation of seafood from Fukushima and nearby areas. It was reported that, despite all the preparation and investigation, China issued an import ban on Japanese seafood, with partial bans in Hong Kong and Macau.

On March 11, 2011, a magnitude 9.1 earthquake struck off the northeast coast of Honshu, Japan, generating a deadly tsunami. Systems at the Fukushima nuclear plant detected the earthquake and automatically shut down the nuclear reactors. Emergency diesel generators automatically turned on to keep coolant pumping around the nuclear cores to try and keep them cool.

But soon after, the tsunami wave, over 46 feet high, hit Fukushima. The water overwhelmed the defensive sea wall, flooding the plant and knocking out the emergency generators. Workers rushed to restore power, but in the days that followed the nuclear fuel in three of the reactors overheated and the cores partly melted down.

The Fukushima nuclear disaster released radioactive materials into the environment and forced thousands of people to evacuate their homes. Ever since 2011, crews have continuously pumped water through the destroyed reactors to keep the nuclear cores cool. In addition, water flows naturally from the mountains toward the sea.

Approximately 150 tons of groundwater, naturally running from the mountainside toward the ocean, flows into the reactor buildings each day, contacts the reactor cores, and becomes newly contaminated water. Countermeasures are taken – filtration to remove radionuclides, and storage in tanks with secondary containment – to prevent the contaminated water from flowing out to the port or leaking from the storage tanks.

TEPCO, which owns the nuclear plant, has been pumping, filtering, and storing the water in tanks at the plant. Now, they say they are running out of space to store the water on land. Last summer TEPCO obtained the approval of the International Atomic Energy Agency (IAEA) for a plan to begin releasing the stored water into the Pacific Ocean sometime this year.

IAEA Director General Grossi accepted Japan’s invitation and appointed a Task Force of independent experts and IAEA staff to carry out the three-pronged review – regulatory, technical and independent sampling and analysis – against international safety standards. These safety standards reflect an international consensus and serve as a global reference for protecting people and the environment from the harmful effects of ionizing radiation. In January the IAEA Task Force completed their second regulatory reviews in Japan.

No one is taking this lightly. TEPCO crews have continued to pump groundwater through the wrecked reactors to constantly cool the melted nuclear fuel. This cooling water picks up radiation in the form of radionuclides. The water then passes through a specialty filtering process to remove and capture much of the radiation, but the process does not effectively capture tritium, because tritium forms water molecules and no filtration process is perfect. Tritium is a hydrogen atom with one proton and two neutrons in its nucleus. Though produced naturally in the upper atmosphere, tritium is also produced as a byproduct in nuclear reactors and nuclear explosions.

TEPCO will gradually release up to 22 trillion becquerels of tritium per year from the Fukushima Nuclear Power Station over the next 20 or 30 years. The tritium level in the water released will be below the maximum recommended for drinking water by the World Health Organization (10,000 becquerels per liter). Tritium has a 12-year half-life and gives off only low-energy beta particles that are believed to pose limited risks to marine life and humans. However, there are limits to the ocean’s ability to sustainably dilute residual contamination. Tritium levels will be monitored and reported on the TEPCO website.
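That half-life matters for the long-term picture: each year's release decays away on its own. A small sketch, using the 12-year half-life figure quoted above:

```python
# Tritium decays with a ~12-year half-life (the figure used in the post).
# This tracks how much of a single year's maximum 22-trillion-Bq release
# remains after a given number of years.
def remaining_activity(initial_bq: float, years: float, half_life: float = 12.0) -> float:
    """Radioactive decay: N(t) = N0 * 0.5^(t / half_life)."""
    return initial_bq * 0.5 ** (years / half_life)

release = 22e12  # Bq, one year's maximum planned release
after_30 = remaining_activity(release, years=30)
print(f"Fraction left after 30 years: {after_30 / release:.2f}")  # about 0.18
```

By the end of the 30-year release schedule, over 80% of the tritium released in the first year will already have decayed, which is part of why regulators consider the cumulative ocean burden manageable.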

Sunday, August 20, 2023

The Eroding Financial Strength of the US

The United States has set a goal to reach 100% carbon pollution-free electricity by 2035 and net-zero emissions throughout the economy by 2050. The President also pledged an interim goal of a 50–52% reduction in economy-wide net greenhouse gas pollution from 2005 levels by 2030. These goals are pipe dreams, as much of the promises made under the Paris Agreement seem to be. The U.S. Energy Information Administration (EIA) forecasts that by 2030, energy-related CO2 emissions will fall to 25% to 38% below 2005 levels.

The window for limiting global warming to 1.5 degrees Celsius has probably closed. CO2 emissions from fuel have continued to grow year after year, with the exceptions of brief respites during the global financial crisis and the Covid-19 lockdowns; world CO2 emissions have since resumed their climb. Total CO2 emissions for planet earth reached 40.6 billion tonnes of CO2 (GtCO2) in 2022.

“The scientific community has made clear that the scale and speed of necessary action is greater than previously believed.  There is little time left to avoid setting the world on a dangerous, potentially catastrophic, climate trajectory.  Responding to the climate crisis will require both significant short-term global reductions in greenhouse gas emissions and net-zero global emissions by mid-century or before.”

“It is the policy of my Administration to organize and deploy the full capacity of its agencies to combat the climate crisis to implement a Government-wide approach that reduces climate pollution in every sector of the economy; increases resilience to the impacts of climate change; protects public health; conserves our lands, waters, and biodiversity; delivers environmental justice; and spurs well-paying union jobs and economic growth, especially through innovation, commercialization, and deployment of clean energy technologies and infrastructure…” 

So, our nation is on a mission to reduce greenhouse gas emissions and to prepare for and respond to climate change. Climate change is increasing the frequency and supercharging the intensity of drought, lengthening wildfire seasons in the Western states, and making extremely heavy rainfall more common in the eastern states. Sea level rise is worsening hurricane storm surge flooding. In 2022, the U.S. experienced 18 separate weather and climate disasters each costing at least 1 billion dollars, with the total reaching $165 billion.

from NOAA

According to NOAA, the number and cost of weather and climate disasters are increasing in the United States due to a combination of increased population and material wealth over the last several decades, exacerbated by the fact that much of the growth has taken place in vulnerable areas like coasts, the wildland-urban interface, and river floodplains.

The problem is that as a nation we are spending our way into poverty, reducing our ability to afford to respond to disasters and protect our citizens. Since the global financial crisis in 2008, when our national debt stood at $13.6 trillion, it has ballooned to $32.7 trillion today. In 2007, before the global financial crisis, our annual budget deficit was $0.16 trillion; the following year it ballooned to $0.45 trillion and simply never returned to pre-2008 levels. In 2020 the budget deficit was $3.13 trillion; currently it is $1.61 trillion. Our debt now exceeds our GDP.




Wednesday, August 16, 2023

Carbon Offsets and Regulators

Carbon credits, or offsets, and their related markets provide tools for organizations seeking to reduce their carbon footprint and comply with both voluntary and mandatory emissions reduction goals. Carbon credits represent carbon emission reductions or removals and are traded on various exchanges or markets.

Carbon markets allow carbon emitters to purchase credits that are awarded to projects that remove or reduce atmospheric carbon. These credits offset the purchasers’ emissions toward their voluntary commitments to reduce “net” emissions. Each carbon credit typically corresponds to one metric ton (tonne) of reduced, avoided, or removed carbon dioxide or equivalent greenhouse gas.
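The "net" arithmetic is simple credit-for-tonne bookkeeping. A minimal sketch, with hypothetical figures:

```python
# One carbon credit typically corresponds to one metric ton (tonne) of
# CO2-equivalent. This is the "net" emissions bookkeeping a company does;
# the figures below are hypothetical.
def net_emissions_tonnes(gross_tonnes: float, credits_retired: int) -> float:
    """Gross emissions minus retired offsets (1 credit = 1 tCO2e)."""
    return max(gross_tonnes - credits_retired, 0.0)

net = net_emissions_tonnes(gross_tonnes=50_000, credits_retired=50_000)
print(net)  # 0.0 -> the company can claim "net zero" for the period
```

The simplicity of this arithmetic is precisely why regulators worry: a "net zero" claim is only as good as the quality of the credits retired, which is where the fraud concerns discussed below arise.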

Offsets are a popular tool that companies use to reduce their net greenhouse gas (GHG) emissions and live up to their environmental, social and governance (ESG) goals, as well as promises made to customers and consumers. By purchasing carbon offsets, businesses believe they are financing renewable energy projects that remove GHG emissions from the atmosphere or avoid GHG emissions – such as commitments to preserve forests or the construction of facilities to capture carbon emissions – without being involved directly in these projects.

These voluntary markets can be distinguished from “compliance” carbon markets, which is the term for systems where a government or regulator issues a carbon allowance that participants must not exceed unless they can purchase additional compliance allowances from another participant under the cap-and-trade program.

The Inflation Reduction Act (IRA) is a far-reaching law that includes provisions to “finance green power, lower costs through tax credits, reduce emissions, and advance environmental justice.” The IRA is intended to reduce U.S. carbon emissions by roughly 40% by 2030 and to reach a net-zero economy by 2050. Its passage has inspired greater regulatory scrutiny of the carbon credit markets to avoid greenwashing.

The Commodity Futures Trading Commission (CFTC), the Securities and Exchange Commission (SEC), and the Federal Trade Commission (FTC) have all proposed updated rules to address deceptive claims about the use of carbon offsets. The CFTC is also exerting jurisdiction over fraud and manipulation in "physical" carbon markets.

The CFTC recently created an Environmental Fraud Task Force to examine fraud and other misconduct in regulated and voluntary carbon markets. In particular, the CFTC is interested in:

  • manipulative and wash trading or other violations in carbon market futures contracts
  • fraud in markets related to ghost or illusory carbon offsets listed on carbon market registries
  • double counting or other fraud related to carbon offsets, where the same offset is claimed by more than one entity without an additional carbon benefit
  • fraudulent statements relating to material terms of the carbon offset
  • manipulation of tokenized carbon markets
  • fraudulent claims that offsets are in addition to any reductions that would have occurred in a business-as-usual scenario or as required by law

The FTC will use its broad statutory authority over unfair and deceptive practices with respect to environmental claims. The FTC is finalizing standards for the “Use of Environmental Marketing Claims” to provide clarity and stricter guidance for claims made by using carbon offsets that products or businesses are carbon-neutral, have net‑zero emissions, or are low‑carbon or carbon-negative.

There is currently no legal requirement that companies verify the quality of offsets used to make climate claims, although many voluntarily verify through organizations such as the Integrity Council for the Voluntary Carbon Market (ICVCM).

Sunday, August 13, 2023

The Cold Tongue, El Niño, and Climate Change

Two years ago, the U.N. Intergovernmental Panel on Climate Change (IPCC) released its latest report on climate change, in which it greatly narrowed the likely future temperature rise. Nonetheless, emissions of carbon dioxide have been rising by about 1% per year on average for the past decade (with a slight pullback during the pandemic). Though renewable energy use has been expanding rapidly, much of it is being deployed alongside existing fossil energy, not replacing it.

All the climate models tie the rise in global temperatures to concentrations of atmospheric carbon dioxide, and those concentrations are still rising. The planet has warmed 1.1 degrees C since the late 19th century and is expected to warm an additional 0.4 degrees C in the next 20 years. To drive the point home, this past July was reportedly the warmest month on record.

Mankind, by burning fossil fuels and covering the earth with concrete, is responsible for this rise in temperature. No actions nations are likely to take can change this trajectory; we only have some hope of moderating it. That was the key finding of the IPCC report. Total CO2 emissions for planet earth have passed 40.6 billion tonnes of CO2 (GtCO2) per year.

The Pacific, the largest ocean on earth, has a surface larger than all the continents combined. The planet’s weather is shaped by the El Niño Southern Oscillation, in which winds across the Pacific Ocean move the planet between La Niña and El Niño conditions every few years, and by the Pacific Decadal Oscillation, a much longer pattern that appears to play out over 20–30 years.

As Madeleine Cuff pointed out in an excellent article last week in New Scientist, the massive climate models created to simulate our planet predict that, as a result of climate change, the surface of the Pacific Ocean should be warming. But hidden in the Pacific’s large natural variability is an oddity. Between 1980 and 2022 the planet’s sea surface temperatures increased faster than the Earth’s surface temperature. However, an area of the eastern Pacific Ocean, emanating from the coast of South America, has been cooling, defying all the models. That area is known as the cold tongue.

While the eastern Pacific Ocean has always (as far as we know) been cooler than the western Pacific, the difference has increased by about 10% (about half a degree Celsius). This oddity may have a significant impact on how quickly the planet warms and on the resulting weather patterns. If the cold tongue were to shift the base state of the climate toward persistent La Niña-like conditions, climate resilience plans would need to change to respond to unanticipated extensive and lasting drought in the U.S. Southwest and the Horn of Africa.

Scientists do not understand the cause of the cold tongue or how long it may continue. It matters for understanding future planetary climate conditions and how severe they will be: if it continues, the cold tongue could reduce the anticipated global warming by about 30%, according to some researchers. Many efforts have been made to reconcile the discrepancies between climate model projections and real-world observations (Solomon and Newman 2012; L'Heureux et al 2013a; Luo et al 2018; Chung et al 2019) with various theoretical arguments.

Researchers at the Lamont-Doherty Earth Observatory at Columbia University have been studying the Pacific oscillations and climate modeling. Their work indicates that not all climate models include Antarctic meltwater in their calculations, and some have trouble correctly reflecting changes to sea temperatures, winds, and currents in the Southern Ocean. As a result, warming projections for this century by current global climate models may be overestimated. That would be good news.

Other scientists believe the current climate models will eventually be proven correct: the eastern Pacific will flip to warming because of all the greenhouse gases mankind continues to pump into the atmosphere. To try to solve the mystery, an international working group formed this year to study the cold tongue. To predict what will happen next, we first need to understand what is happening now.

Read the research from Lamont-Doherty Earth Observatory, the linked studies above and the full article in New Scientist if you are interested in learning more.

Wednesday, August 9, 2023

Virginia SCC approves a Demand Response Program for Dominion

The Virginia State Corporation Commission (SCC) has approved Dominion Energy’s request for five new "demand-side" management programs, intended as elements of the utility’s efforts to maintain grid stability as it decarbonizes and to reduce costs to customers.

from SCC Dominion Energy VA typical bill

According to a report in the appendix of the SCC hearing, the new programs expand on an existing pilot program limited to 10,000 customers. Called an off-peak plan and originally approved in 2021, it used a “Time of Use” rate that charged participating customers more for electricity during peak hours while saving them money during times when people don’t traditionally use energy. That program saved customers on average $17 a year. In addition, customers used 9.4% less energy during peak times in the summer and 2.9% less during the winter.

However, the report found that what it termed “high-baseline” customers – those who must reduce energy use during peak times to save money – saw their bills increase, while “structural winners” – those who simply benefit from the rate changes without having to alter their habits – experienced all the savings. As the Attorney General’s Office pointed out in a letter to the SCC, that may prove to be an unsustainable design, encouraging only structural winners to participate while the customers with the greatest potential for load shifting abstain to avoid higher costs. With growing electric bills, though, people could also learn to adjust their power usage to save money.

According to Dominion Energy’s website, they will expand the program to reward participating business customers for reducing their electricity use during times when the power grid is experiencing heavier than normal use:

“Our Non-Residential Distributed Generation Program partners with interested customers to switch their power source from the Dominion grid to a backup generator for a limited number of hours each year. In return, the customer receives a monthly incentive payment based on their reduced power consumption.”

The program will provide participating customers a monthly incentive to allow their on-site backup generators to be remotely activated by PowerSecure during load curtailment events (a "control event") for up to a total of 120 hours per year. The monthly incentive is based on the amount of load curtailed, the amount of fuel consumed during load curtailment events, and/or tests requested by Dominion Energy Virginia during the month.

Monthly Participation Payment = Load Curtailment Capability Payment + Fuel Payment + Variable Operations & Maintenance Adder

It seems that slowly but surely Dominion is moving toward having large flat-demand industrial users (data centers) become dispatchable microgrids to protect the main PJM grid. Microgrids are self-contained electrical networks that draw from on-site energy sources; they supplement grid power to keep the data center online in the case of a power shortfall and prevent a grid outage. Diesel backup generators have been the norm in Virginia, but diesel is the dirtiest source of power for a microgrid, and frankly, after this summer of awful air, I want better. As the backup power for data centers becomes the emergency dispatchable power for our region, we must have cleaner power. We can no longer allow diesel generators to be the backup power for data centers. This dispatchable power must be natural gas, fuel cells, or other generation combined with on-site energy storage.

Sunday, August 6, 2023

Digital Gateway Rezoning is Coming

On November 1, 2022, the Prince William Board of County Supervisors adopted the Comprehensive Plan Amendment for the PW Digital Gateway. They did this without performing a watershed study, as requested by Fairfax County and Fairfax Water. The Digital Gateway development could very likely endanger the Occoquan watershed – the most urbanized watershed in the nation and one currently experiencing degradation – and the Occoquan Reservoir, the source of water for 800,000 Prince William and Fairfax residents. The data center companies, the landowners, and seemingly the Democratic majority of our Prince William County Supervisors could not care less. To move forward, the proposed data centers have to be rezoned and obtain a Special Use Permit (SUP).

Now, three rezoning applications have been submitted within the area of the PW Digital Gateway Comprehensive Plan Amendment: Digital Gateway North, Digital Gateway South, and Compass Datacenters. The proposals also request a waiver of the requirement that the proposed data centers have an approved SUP. The rezoning requests lack a detailed layout of the site, despite detailed requests for additional information from PWC staff; they are too general to even determine the location of site features and Resource Protection Areas under the Chesapeake Bay Preservation Act. It appears that the data centers propose using the RPA as the path for the power lines. That is forbidden by Virginia law – the Chesapeake Bay Preservation Act.

To waive the SUP requirement, PWC staff would need the same level of detail in the rezoning request as would be required with the SUP, and all relevant impacts would have to be appropriately mitigated to protect our water, our grid, and our citizens, just as a SUP would require. Yet the data center developers think that on such a large and complicated site, asking for this level of detail is unreasonable. They failed to submit the requested information, address the impacts to the properties to the west, or develop adequate mitigations for the impacts to the Battlefield and historical resources.

Data centers are the physical factories of the internet. Standard data centers are warehouses filled with row upon row of servers, routers, wires, and other information technology hardware, spanning hundreds of thousands of highly cooled square feet per building and sucking up incredible amounts of power. Now we have the emerging demand for AI data centers. These specialized data centers run high-performance chips, like the Nvidia graphics processing units, that can use seven times the power of traditional data centers. This requires additional power infrastructure, and the extra power generates more heat, requiring liquid cooling to prevent the equipment from overheating. The Applicants have not submitted the maximum daily water demands and peak wastewater flows for each phase of development, so the hydraulic capacity studies by the PW Service Authority cannot be completed.

The intention to migrate these 2.5 million square feet of data centers to AI may be revealed in the proposed water usage. As PWC staff commented: “The CPA policies encourage efficient water usage for data center development using closed loop water, or no water-cooling systems. What is proposed is not consistent with the CPA policies.” Furthermore, the Applicants propose increasing the height limit to 100 feet for the buildings, without counting the roof-mounted cooling equipment in that height calculation. Yet the viewshed analysis was done using a lower height and only for points in the Battlefield, not the western property boundary.

Though the rezoning request provides numerous notes and labels throughout, there are no assurances that the proposed improvements, standards, and locations of these items will be complied with. The Applicants describe items as “typical” rather than specific to the plan. In addition, substation locations shown are “approximate.” Limits of development shown are “approximate.” Wildlife crossings shown are “approximate.” Basically, the data center Applicants are saying trust us. It’s a blank check, and they can do anything they want, endangering our environment, our water supply, and our power supply.

Last spring Dominion Energy released its Integrated Resource Plan (IRP), which details how it plans to meet the electricity needs and demands of its customers over the next 15 years. The picture it painted is that Dominion cannot both meet the power demand of the exploding number of data centers in Virginia and satisfy the mandates of the Virginia Clean Economy Act (VCEA). The 2020 VCEA is the state law outlining a path to decarbonize the electric grid by 2050. With the specter of seven times the power usage, Dominion may not even be able to meet the power demand.

Dominion Energy Virginia does not produce all the electricity it delivers and sells. Dominion Energy is a member of PJM Interconnection, LLC (PJM), the regional transmission organization coordinating the wholesale electric grid in the Mid-Atlantic region of the United States. PJM’s other members supply a significant amount of the energy used in Virginia, about a fifth. PJM recently identified increasing reliability risks due to the growing demand for power in Virginia, the profile of that demand, and the premature retirement of dispatchable carbon generation facilities across the region.

According to PJM, data center load in Northern Virginia will grow at 7% annually. At that rate, electricity demand from traditional data centers alone will roughly double over the next 10 years; for AI data centers, it will be multiples of that.
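As a quick sanity check on that claim, compounding 7% annual growth for a decade does roughly double the starting load. The sketch below uses an arbitrary 100 MW base load purely for illustration; that figure is not from PJM:

```python
# Compound 7% annual growth over 10 years.
# The 100 MW base load is an illustrative assumption, not a PJM figure.
base_load_mw = 100.0
growth_rate = 0.07
years = 10

projected_mw = base_load_mw * (1 + growth_rate) ** years
print(f"Load after {years} years: {projected_mw:.1f} MW")
# 1.07 ** 10 is about 1.97, so the load nearly doubles.
```

The rule of 72 gives the same ballpark: at 7% growth, doubling takes about 72 / 7 ≈ 10 years.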

In the rezoning requests for the Digital Gateway, the Applicants treated the specific details requested by staff to ensure that the plans comply with federal and state laws, regulations, and rules as too much to ask on such a large-scale development. They ignored them. These rules and regulations exist for a purpose: to protect our water supply, our grid, our environment, and us. The details are necessary. After all, the devil’s in the details.

Wednesday, August 2, 2023


I know that I refer to PJM fairly often lately, but it’s been quite a while since I covered the basics of power, generation, and PJM. The electricity that powers our lives, charging our phones, powering the internet, equipment, lights, homes, offices, air conditioning, and soon everything else, is there when we need it because of the power grid, an interconnected system that keeps electricity flowing to our homes and businesses every moment of every day. PJM Interconnection, a regional transmission grid operator, works behind the scenes to ensure the reliability of the power grid and to keep the lights on.

PJM began in 1927, when three utilities formed the world’s first continuing power pool to gain the benefits of combining their resources. These utilities connected to each other through high-voltage transmission lines, creating an interconnection of resources. Interconnection is a two-way street, allowing those connected to the grid to take or give power, sharing resources back and forth as needed. The PJM grid once covered Pennsylvania, New Jersey, and Maryland (the source of the initials PJM); today, PJM extends into 13 states and the District of Columbia: Delaware, Illinois, Indiana, Kentucky, Maryland, Michigan, New Jersey, North Carolina, Ohio, Pennsylvania, Tennessee, Virginia, and West Virginia.

PJM is a regional transmission organization (RTO) and an independent system operator (ISO), taking responsibility for grid operations, reliability, and transmission service within its defined geographic region. The North American transmission grid includes all of the United States and most of Canada and is made up of the nine major RTOs/ISOs. To ensure that these organizations operate efficiently and reliably, the United States government empowers the North American Electric Reliability Corporation (NERC) to establish and enforce performance standards for all of them, except for the Texas grid, which the state of Texas regulates separately.

The large size of PJM’s market area increases the diversity of resources available to meet consumer needs, providing benefits to both market participants and consumers and enhancing the reliability of the grid. Power pooling gives PJM members the benefit of drawing on electricity resources spanning a broad geographic area, varied weather (not everyone is hit with the same storms), and diverse generation sources. This means that if one area is short on resources, power can be brought in from a different area to ensure grid reliability.

While some large-scale battery storage options are now available, many are reserved for emergency situations; the vast majority of electricity must be used when it is generated. For this reason, PJM must have generators up and running to supply electricity when outdoor temperatures soar and customers “demand” more power to run air conditioners, or when utilities restore power to a large number of customers after a major storm. Hence the charts I have occasionally put up of PJM’s forecast of power needs for the next 24 to 48 hours. PJM forecasts the demand and makes sure the power generation is available to meet that need.