Wednesday, November 29, 2023

COP 28

The United Nations Climate Change Conference – COP 28 will begin tomorrow, November 30, 2023, in Expo City, Dubai, in the United Arab Emirates (UAE). The conference is scheduled to run until December 12, 2023. Pre-meetings have been ongoing since November 24th. President Biden will not be attending. The United States will be represented by Special Envoy John Kerry.

Officials are expecting some 70,000 climate advocates, diplomats and other green groupies to attend the event in Dubai, one of the gleaming modern cities of the Middle East built on the wealth that fossil fuels have brought to the region. The fact that the world’s most important climate gathering will be hosted by a leading oil producer has sparked outrage among environmentalists, and I expect a certain showmanship in the protests. Most people will be restricted to the “green zone” where all the climate theatrics will take place. Access to the blue zone (and true participation in the meeting) is restricted to delegates, admitted observer organizations and accredited members of the press and media. Delegations from the nearly 200 Parties to the UN Framework Convention on Climate Change, under which the Paris Agreement was negotiated in 2015, are expected to attend.

Under the 2015 Paris Agreement, every country agreed to work together to limit global warming to well below 2 degrees Celsius and to aim for 1.5 degrees, to adapt to the impacts of a changing climate, and to make money available to countries that cannot afford the costs of adapting to a changing climate. The parties to the agreement committed to create national plans, called Nationally Determined Contributions (NDCs), setting out how much they would reduce their emissions, and agreed to update those plans every five years.

The achievement by a party of its NDC is not a legally binding obligation, nor is a country bound to any particular policies to achieve its target. It can, at any time, revise those targets and policies without legal ramifications. 


from the Global Carbon Project


The problem is that China's CO2 emissions are now more than 230% of those of the United States and growing rapidly. India's emissions are almost equal to those of the 27 members of the European Union. Neither nation has any plans to reduce emissions. The United States (by executive order and administrative action) has set a goal to reach 100% carbon-free electricity by 2035 and net-zero emissions throughout the economy by 2050. The President, by executive order, also pledged an interim goal of a 50-52% reduction from 2005 levels in economy-wide net greenhouse gas pollution by 2030.

from the Global Carbon Project

Though China has far surpassed the current CO2 emissions of the United States, our per capita carbon footprint is still one of the highest on earth.

The U.S. carbon emission target is ambitious and would require a carbon-slashing overhaul of the U.S. economy. The EIA is forecasting that we will not achieve that goal. Although carbon emissions have been generally trending down since 2005, there is no pathway to reach the 2030 goal. The misleadingly named Inflation Reduction Act (IRA) includes many programs designed to remake the climate impact of the entire U.S. economy. The Congressional Budget Office assigned a $391 billion cost to the law's climate and clean energy programs, based on its estimate of spending under those programs. It is a very loose estimate because the law's major environmental tax incentives have no caps. Several investment banks have estimated the true cost of the environmental programs at near or above a trillion dollars. Implementation of the law and its impact will matter.

Here at home the energy needs of the Commonwealth are changing and growing. Virginia is already the data center capital of the world, and the industry is exploding along with the demand for the 24-hours-a-day, 7-days-a-week power needed to run it. According to Dominion Energy, the demand for electricity in Virginia is growing at 7% a year to power the data centers. At the same time, the Virginia Clean Economy Act (VCEA) has the Commonwealth on a short timeline to decarbonize the grid and electrify transportation and heating.

Dominion Energy, in its 2023 Integrated Resource Plan (IRP) filed with the SCC this past summer, did not have a viable pathway to decarbonize the grid. The picture it paints is that Dominion cannot both meet the power demand of the exploding number of data centers in Virginia and the mandates of the Virginia Clean Economy Act (VCEA). The United States as a whole, and Virginia in particular, have a mismatch between goals and actions. The time for magical thinking and greenwashing is past.


Sunday, November 26, 2023

Planting Giant Sequoias Sparks Controversy

Giant sequoias are conifer trees that can live for 3,000 years and grow to 300 feet tall and 30 feet wide at their base. These towering trees grow naturally only on the western slopes of the Sierra Nevada mountains in California. Although they have evolved to be wildfire-resistant, they have suffered greatly in the wildfires of the last six years. Giant sequoias are very thick-barked, which provides partial immunity to fire, and the species even relies on fire for reproduction; however, scientists say the warming climate has made wildfires worse and deadlier for the trees.

from the USFS

Misguided forestry practices, which sought to suppress beneficial small and moderate fires, allowed woodlands to become overly dense and ended up fueling the conflagrations. Fire is essential to giant sequoias. Tree-ring records from giant sequoias show that frequent surface fires were the typical pattern of fire occurrence over the past 2,000 years. But this pattern changed after about 1860, when fire frequency declined sharply. During the century from the late 1800s until the late 1900s, fire was rare in many giant sequoia groves due to land use changes and many decades of fire suppression.

Giant sequoias have coexisted with fire for thousands of years. Their thick, spongy bark insulates most trees from heat injury, and the branches of large sequoias grow high enough to avoid the flames of most fires. Also, fire’s heat releases large numbers of seeds from cones, and seedlings take root in the open, sunny patches where fire clears away groundcover and kills smaller trees. But starting in 2015, higher-severity fires have killed large giant sequoias in much greater numbers than has ever been recorded.

Six fires occurring between 2015 and 2021 killed many large sequoias in numerous groves across the Sierra Nevada. More than 85% of all giant sequoia grove acreage across the Sierra Nevada burned in wildfires between 2015 and 2021, compared to only one quarter in the preceding century. The Forest Service believes we have reached a tipping point: the lack of frequent fire for the past century in most groves, combined with the impacts of a warming climate, has made some wildfires much more deadly for the sequoias.


As part of a multi-year project to improve forest health through reforestation, Sequoia National Forest personnel worked closely with the national nonprofit American Forests to plant over 286,000 trees across 1,380 acres, including over 14,000 giant sequoia seedlings. While new trees may regenerate on their own after a wildfire, this cannot happen after high-severity burns when all the overstory trees are dead. In such cases, few, if any, green trees remain, and burned seeds on the forest floor are often unable to develop after experiencing such high temperatures. This is when planting becomes necessary to retain a forest landscape.

Also, debris and fallen trees need to be cleared from the forest floor to allow seedlings to grow in bare soil, and clumps of burned trees that remain standing need to be removed from some areas. These burnt-out trees are structurally weak and may fall soon, endangering the next generation of trees or the forest workers. According to the Forest Service, it is important not to wait too long to plant. If the landscape remains deforested, brush species that thrive in disturbed areas will quickly carpet the forest, absorbing the moisture and space seedlings need to thrive. Many park land managers worry about vegetation type conversion, which follows high-severity fire and occurs when forested lands transition to shrublands. This can cause an ecosystem shift and is associated with a loss in biodiversity.

Now, after two months of planting, a handful of conservation groups filed suit last week to stop the work. The groups contend that the reforestation project, which entails planting tens of thousands of sequoia seedlings on charred hillsides in Sequoia and Kings Canyon national parks, is inappropriate because the burned areas are designated “wilderness,” where human intervention is prohibited. Contrary to what park officials and the Forest Service say, the litigants assert that replanting trees is not needed for the groves to successfully regenerate.

Wednesday, November 22, 2023

Hydrogen Fuel from Waste Plastic

The following is excerpted from a Rice University news release and an MIT news release:

In the early 21st century, vehicles using hydrogen-powered fuel cells rivaled battery electric vehicles (EVs) as the best way to decarbonize the car industry by replacing gasoline. Today, EVs are way ahead, and the IRA has clearly chosen its winner: EVs. The big car companies are trying to rapidly electrify their vehicle offerings, but are facing resistance from consumers. One of the resisters is my husband, who really wants a hydrogen car.

Research at MIT found that the lifetime cost of ownership for a fuel cell car has come down in recent years, but remains higher than for EVs, largely because of the cost of hydrogen fuel. The researchers found the total cost of ownership for a hydrogen vehicle was around 40% higher than for a comparable gasoline vehicle, and about 10% higher than for an EV.

EVs have another crucial advantage over hydrogen. There already exists a vast nationwide electrical system. A nationwide transition to electric vehicles creates big challenges, including the need to build a charging network and make plenty of extra electricity to power all these cars and trucks. 

Hydrogen has its own advantages. The fuel can be pumped in less time than it takes to charge an EV battery, and it can deliver longer driving ranges more in line with gasoline cars. Hydrogen more closely resembles the pump-and-go experience everyone knows from using gasoline. However, that experience would require creating an enormous amount of hydrogen and then moving it to refueling stations all over the country.

Innovations to make hydrogen cleaner and cheaper could help make fuel cell vehicles competitive once again and possibly more desirable. Until now, the methods used to make hydrogen either generate too much carbon dioxide or are too expensive. Green hydrogen, produced using renewable energy to split water into its two component elements, costs roughly $5 per kilogram (just over two pounds).

“The main form of hydrogen used today is 'gray' hydrogen, which is produced through steam-methane reforming, a method that generates a lot of carbon dioxide,” said James Tour, a materials scientist. Most of the nearly 100 million tons of hydrogen used globally in 2022 was gray hydrogen derived from fossil fuels, and its production generated roughly 12 tons of carbon dioxide per ton of hydrogen.
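Just to put those two numbers together, here is a back-of-the-envelope sketch using only the figures quoted above (not data from the release):

```python
# Rough scale check using the figures quoted above (not measured data)
hydrogen_tons_per_year = 100e6   # ~100 million tons of hydrogen used globally in 2022
co2_per_ton_h2 = 12              # ~12 tons of CO2 per ton of gray hydrogen

total_co2_tons = hydrogen_tons_per_year * co2_per_ton_h2
print(f"CO2 from gray hydrogen: ~{total_co2_tons / 1e9:.1f} billion tons per year")
```

That works out to something on the order of a billion tons of CO2 a year just to make the hydrogen the world already uses.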

Recently, researchers from Rice University (James Tour and Kevin Wyss) have found a way to harvest hydrogen from waste plastic using a low-emissions method that could more than pay for itself.

The researchers converted mixed waste plastics into high-yield hydrogen gas and high-value graphene. They exposed plastic waste samples to rapid flash Joule heating for about four seconds, bringing the temperature up to 3,100 kelvin. The process vaporizes the hydrogen present in plastics, leaving behind graphene, an extremely light, durable material made up of a single layer of carbon atoms.

"When we first discovered flash Joule heating and applied it to upcycle waste plastic into graphene, we observed a lot of volatile gases being produced and shooting out of the reactor," Wyss said. "We wondered what they were, suspecting a mix of small hydrocarbons and hydrogen, but lacked the instrumentation to study their exact composition."

"We know that polyethylene, for example, is made of 86% carbon and 14% hydrogen, and we demonstrated that we are able to recover up to 68% of that atomic hydrogen as gas with a 94% purity," Wyss said. The scientists hope that this work will allow for the production of clean hydrogen from waste plastics, possibly solving both the major environmental challenge of plastic pollution and the greenhouse gas-intensive production of hydrogen by steam-methane reforming.
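To make the polyethylene numbers concrete, here is a minimal sketch of the implied yield; the one-ton batch size is my own assumption for illustration, not part of the Rice work:

```python
# Illustrative yield estimate from the polyethylene figures quoted above
# (a sketch, not the researchers' own calculation)
pe_mass_kg = 1000.0          # one metric ton of polyethylene (assumed batch size)
hydrogen_fraction = 0.14     # polyethylene is ~14% hydrogen by mass
recovery = 0.68              # up to ~68% of that hydrogen recovered as gas

h2_recovered_kg = pe_mass_kg * hydrogen_fraction * recovery
print(f"Hydrogen recovered per ton of polyethylene: ~{h2_recovered_kg:.0f} kg")
# -> roughly 95 kg of hydrogen gas (at ~94% purity) per ton of plastic
```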

Sunday, November 19, 2023

Water and Data Centers

We all know that data centers use huge amounts of electricity to power their millions upon millions of chips. However, data centers also use large amounts of water for cooling systems, which ensure that the heat produced by these massive facilities is controlled.

Data center cooling systems use large amounts of water to operate. These systems include cooling towers, chillers, pumps, piping, heat exchangers/condensers, and air conditioning units in the computer rooms. Additionally, data centers need water for their humidification systems (to avoid static discharges) and for facility maintenance. Data centers are either water-cooled or air-cooled, with water-based evaporative cooling systems more common, particularly for large data centers, simply because they are more efficient and effective. Direct-contact cooling systems using evaporation can remove and release all of the heat produced inside a data center by the servers and other IT equipment.

In a water-cooled system, water-cooled chillers and cooling towers located on the data center roofs produce chilled water, which is delivered to computer room air conditioners to cool the entire building. In 2021, when Prince William County looked at water consumption for its 25 operational data centers at the time, it found that water use varied by season and ranged from about 0.2 to 0.5 gallons per square foot per day. The data centers that Prince William looked at were all relatively small, 100,000-250,000 square feet, nothing like the hyperscale centers being built now. Today, data centers seem to start at a million square feet and move up from there with multiple-building campuses. How water use scales up in multi-story data centers is unknown.
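For a sense of scale, here is a minimal sketch that applies the Prince William County per-square-foot range to a hypothetical one-million-square-foot building; it assumes linear scaling, which, as noted above, has not been confirmed for hyperscale or multi-story centers:

```python
# A minimal sketch using the Prince William County figures above.
# Assumes per-square-foot use scales linearly, which the county data does not confirm.
gallons_per_sqft_day = (0.2, 0.5)   # observed range for ~100,000-250,000 sq ft centers
building_sqft = 1_000_000           # hypothetical hyperscale building

low = gallons_per_sqft_day[0] * building_sqft
high = gallons_per_sqft_day[1] * building_sqft
print(f"Estimated water use: {low:,.0f} to {high:,.0f} gallons per day")
# -> roughly 200,000 to 500,000 gallons per day if the small-building rate held
```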

For cooling purposes, data centers typically use potable water, on-site groundwater, surface water, or rainwater capture systems. Prince William County believes that most data center water comes from potable water supplies. In Loudoun, to some extent, data centers source non-potable/recycled water, which is treated sewage. In Prince William, the treated sewage from UOSA is already used by Fairfax Water to supplement the water supply to the Occoquan Reservoir for our drinking water supply.

Water used to cool data centers is either consumed, meaning it evaporates into the atmosphere via the data center’s cooling towers, or is discharged as industrial wastewater, usually to a local wastewater treatment plant. Effective water treatment, either on-site or off-site through a wastewater treatment plant, means that the water can be reused in the cooling system several times if the water quality (e.g., hardness) is acceptable. Of course, softening the water could help; however, it introduces more salt into the wastewater.

Data centers reuse water by recirculating the same water through their cooling systems multiple times while replenishing what evaporates. According to Google, this practice saves up to 50% of water when compared with “once-through” cooling systems. However, eventually this reused water needs to be replaced with new water, either because mineral scale forms and damages the cooling equipment or because the conductivity of the water becomes too high and could damage the IT equipment. The need for new water results from the build-up of calcium, magnesium, iron and silica, which become concentrated over multiple evaporative cooling cycles.
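The replacement water described above is what the cooling industry calls makeup water, and the standard bookkeeping looks roughly like the sketch below. The evaporation rate and cycles of concentration here are invented for illustration, not figures from Google or any particular data center:

```python
# Standard cooling-tower water bookkeeping (a sketch; not figures from any data center)
# Makeup water must replace both evaporation and blowdown; blowdown is what keeps
# dissolved minerals (hardness, conductivity) from concentrating past safe limits.
evaporation_gpd = 100_000          # hypothetical gallons evaporated per day
cycles_of_concentration = 4        # how many times minerals concentrate before blowdown

blowdown_gpd = evaporation_gpd / (cycles_of_concentration - 1)
makeup_gpd = evaporation_gpd + blowdown_gpd
print(f"Blowdown: {blowdown_gpd:,.0f} gal/day, makeup: {makeup_gpd:,.0f} gal/day")
# Raising the cycles of concentration cuts blowdown and makeup, but only until
# scale-forming minerals or conductivity reach the equipment's limits.
```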

Amazon Web Services (AWS), Google and others have committed to being “water positive by 2030,” returning more water to communities and the environment than they use in their direct operations. This is a very interesting concept. You see, there is no mechanism on Earth for creating or destroying large quantities of water. All the water we have is what's been here, literally, forever, since the planet was formed 4.5 billion years ago. Of all the water on earth, only about 3% is fresh; however, only about ½% of the water on earth is available for mankind to use. The rest of the fresh water is locked away in ice, super deep groundwater or polluted beyond redemption.

Where exactly are Amazon and Google going to get this excess water they plan to return to communities? Obviously, they would have to take it from another watershed or somewhere else. Water is a zero-sum game here. No one is making water. The available supply of fresh water is continually renewed by the hydrologic cycle or artificially. Raindrops fall to earth and will evaporate, infiltrate into the soil, recharge groundwater or flow along the ground to a stream and ultimately into rivers and to the ocean, always moving. That is the most basic description.

Building data centers can interfere with the hydrologic cycle. Covering once-open wooded areas with impervious surfaces reduces the recharge of groundwater, which impacts stream flow. Changing the use of the land, covering it with buildings, driveways, roads, walkways and other impervious surfaces, will change the hydrology of the site, reducing groundwater recharge in the surrounding area and increasing stormwater runoff velocity and quantity. Once the hydrology is destroyed by development, it cannot be easily restored, if at all.

The Occoquan Reservoir is fed by the Occoquan River, which receives up to 40 million gallons a day of treated discharge from the Upper Occoquan Sewage Authority (UOSA) treatment plant, which discharges to the river upstream of the Occoquan Reservoir. A significant portion of the flow into the reservoir (especially during dry periods) is recycled sewage. This treated wastewater is from areas supplied by the Potomac River or Lake Manassas, so you do not end up constantly recycling and concentrating the same impurities in the drinking water supply.

In addition, the Occoquan Reservoir receives stormwater runoff and precipitation from the Occoquan Watershed, which covers portions of Loudoun, Fairfax, Fauquier, and Prince William counties and feeds the streams and creeks that flow into Bull Run and the Occoquan River. Obviously, the Amazon Web Services (AWS) commitment “to be water positive by 2030, returning more water to communities and the environment than it uses in its direct operations” means that either they are incredibly naive about water, or they plan to take water from another place or disrupt the hydrologic balance to fulfill this pledge.

 

Wednesday, November 15, 2023

Virginia Needs to Manage the Data Centers

Electricity demand typically inches higher slowly with both economic growth and population growth, less any gains in efficiency. Nationally, electric sales grew just about 5% in the past decade. However, electric growth has been surging in some areas.

 

from Dominion IRP 2023

Data centers are one of the biggest new electric power consumers, and demand from them could double by 2030. Some new data centers that have been requesting grid connections from Dominion are as large as 500 megawatts. That is as much as it takes to power hundreds of thousands of homes, according to the Electric Power Research Institute, a nonprofit research and advisory organization.

In Virginia, Dominion Energy, the state’s largest utility, has connected 75 new data centers since 2019. Statewide electricity sales are up 7% so far this year, and the year is not quite over. According to its IRP, Dominion expects electric demand to grow by about 85% over the next 15 years. This is unprecedented growth in electric demand.
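For context, here is the arithmetic behind those growth figures (my own calculation from the percentages above, not Dominion's):

```python
# Implied compound growth rates from the figures above (arithmetic only)
growth_15yr = 1.85      # ~85% total growth over the next 15 years, per Dominion's IRP
years = 15

implied_annual = growth_15yr ** (1 / years) - 1
print(f"Implied average annual growth: {implied_annual:.1%}")       # ~4.2% per year

# For comparison, if this year's ~7% statewide sales growth were sustained:
print(f"15 years at 7%/yr: {(1.07 ** years - 1):.0%} total growth")  # ~176% total
```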

 

from Dominion IRP 2023

While that is happening, Virginia needs to meet the requirements of the VCEA. In 2020, the General Assembly passed the Virginia Clean Economy Act (VCEA), which mandated a goal of 100% zero-carbon energy generation by 2050 and prescribed increasingly strict Renewable Portfolio Standards (RPS) for Virginia's investor-owned electric utilities.

Under the VCEA, Dominion is required by law to make the transition from conventional power plants fueled by coal and natural gas to cleaner forms of energy such as wind and solar. Dominion is also required to provide reliable electricity to its customers. Grid operators across the U.S. have been warning that power-generating capacity is struggling to keep up with demand from the transition to electrification and digital industrialization. The resulting gaps in supply could lead to rolling blackouts during hot or cold weather extremes.


While many data centers have made clean energy and sustainability commitments, there is no way to clearly evaluate these claims or ensure they are aligned with the VCEA’s plans to decarbonize the grid, due to non-disclosure agreements and general secrecy around the industry. However, it is clear from Dominion’s 2023 IRP that the power for the data centers is coming from elsewhere: power purchases. As seen in the chart above, the states in PJM with power to sell are West Virginia and Pennsylvania. That power has to be delivered to Virginia, and the public is being asked to subsidize new transmission infrastructure, bear the burden of compromised viewsheds and land seizures, and compromise on Virginia’s clean energy and conservation goals in order to meet the massive electricity demand caused by one private industry.

Zoning and land use have always been left up to the counties. It is clear that there is no limit to the desirability of data centers to county supervisors and landowners. The counties of Prince William and Loudoun, and increasingly our neighbors, have been blinded by the windfall profits to the landowners and the prospect of increased tax revenue, but these windfall profits come at a cost: the data centers’ flat, around-the-clock power demand profile and the need to expand the grid. We have granted data center companies control over our environment. We are now getting ready to build more gas-fired electricity generation facilities to serve them. The data centers will degrade our land and water resources and increase power and water costs for all Virginians.

Sunday, November 12, 2023

Your Water Contains PFAS. Now What?

My younger brother lives in Massachusetts, where in 2020 the Department of Environmental Protection (MassDEP) published its PFAS public drinking water standard, or Massachusetts Maximum Contaminant Level (MMCL), of 20 nanograms per liter (ng/L), or parts per trillion (ppt), for the sum of the concentrations of six specific PFAS. The six PFAS are: PFOS, PFOA, PFHxS, PFNA, PFHpA, and PFDA. MassDEP abbreviates this set of six PFAS as “PFAS6” and has all public water systems test for them.
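Mechanically, the standard is applied by summing the six compounds and comparing the total to 20 ng/L. A minimal sketch, with made-up sample concentrations for illustration:

```python
# A minimal sketch of how the Massachusetts PFAS6 standard is applied:
# the six listed PFAS are summed and compared against the 20 ng/L MMCL.
# The sample values below are hypothetical, for illustration only.
MMCL_NG_PER_L = 20.0

sample_ng_per_l = {
    "PFOS": 6.1, "PFOA": 5.4, "PFHxS": 3.2,
    "PFNA": 2.0, "PFHpA": 2.8, "PFDA": 1.5,
}

pfas6_sum = sum(sample_ng_per_l.values())
print(f"PFAS6 sum: {pfas6_sum:.1f} ng/L -> "
      f"{'exceeds' if pfas6_sum > MMCL_NG_PER_L else 'meets'} the 20 ng/L MMCL")
```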

This list of PFAS6 is from the national UCMR 3 (a US EPA Safe Drinking Water Act program used to identify emerging contaminants of concern). Roughly 5,000 water systems monitored for six PFAS back then, and according to EPA, 63 water systems serving an estimated 5.5 million individuals detected PFOA and/or PFOS at levels above EPA’s 2016 health advisory level of 70 ppt (separately or combined). Several states took action on their own to protect their citizens, as the US EPA did not take additional action for a decade. Then in March 2023, EPA proposed drinking water standards for PFOA and PFOS at extremely low levels, and higher levels for perfluorobutane sulfonic acid (PFBS), perfluorononanoic acid (PFNA), perfluorohexane sulfonic acid (PFHxS), and hexafluoropropylene oxide dimer acid (HFPO-DA) and its ammonium salt (also known as the GenX chemicals). EPA has not finalized its regulations, and analytical methods capable of verifying the regulatory levels have not yet been achieved.

Nonetheless, in Massachusetts my brother received a notice from his water company that it had found a PFAS6 result that exceeded the Massachusetts Maximum Contaminant Level (MMCL) for drinking water. He and his partner asked me what they should do. A little background.

There are thousands of PFAS chemicals, and they are found in many different consumer, commercial, and industrial products. This category of chemical has been widely used for over 80 years, mainly for the ability to repel oil, grease, water, and heat. PFOS and PFOA, found in Scotchgard, used as an ingredient in making Teflon, and used in traditional Aqueous Film-Forming Foam (AFFF), the Class B firefighting foam used to fight aviation and other chemical fires, were the first to become widely commercially successful.

PFAS were also used in spray coatings for cans and food packaging, in stain-resistant and flame-resistant treatments for carpeting, upholstery, and clothing, and turned up in wash water from light manufacturing and processing. Food in PFAS-containing packaging picked up traces of PFAS, and it was passed on to people that way, too. Basically, PFAS is ubiquitous. When 3M, the former manufacturer of PFOS, took blood samples of people exposed to PFAS, it could not find a control group that did not have PFAS in their bloodstream.

MassDEP is working with small public water systems like his to implement treatment for PFAS so that all community drinking water will meet the MCLs. PFAS can be effectively removed by treatment systems, at least to below the MassDEP MCL, but it is not known how effective they are at achieving the US EPA’s proposed drinking water limits.

Right now the technologies available to remove PFAS are: activated carbon treatment, anion exchange, nanofiltration, and reverse osmosis.

Activated carbon treatment is the most studied treatment for PFAS removal. Activated carbon is commonly used to adsorb natural organic compounds, taste and odor compounds, and synthetic organic chemicals in drinking water treatment systems. Adsorption is both the physical and chemical process of accumulating a substance, such as PFAS, at the interface between liquid and solid phases. Activated carbon is an effective adsorbent because it is a highly porous material and provides a large surface area to which contaminants may adsorb. It is made from organic materials with high carbon content, such as wood, lignite, and coal, and is often used in a granular form called granular activated carbon (GAC). At the analysis levels available in the past, GAC has been shown to effectively remove PFAS from drinking water when it is used in a flow-through filter mode after particulates have already been removed.

Another treatment option is anion exchange treatment, or resins. There are two broad categories of ion exchange resins: cationic and anionic. The positively charged anion exchange resins (AER) are effective for removing negatively charged contaminants, like PFAS. Of the different types of AER resins, perhaps the most promising is an AER in a single-use mode followed by incineration of the resin. Once more, limitations of the analytical methods have hindered verification, but like GAC, AER removes 100 percent of the PFAS (to the limit of analysis) for a time.

High-pressure membranes, such as nanofiltration or reverse osmosis, have been extremely effective at removing PFAS. Reverse osmosis membranes are tighter than nanofiltration membranes. This technology depends on membrane permeability. A standard difference between the two is that a nanofiltration membrane will reject hardness to a high degree but pass sodium chloride, whereas a reverse osmosis membrane will reject all salts to a high degree; exactly how high a degree needs to be confirmed when the test methods are available.

Though my brother’s water company suggested using bottled water for at-risk populations, that may not be a good idea. A study published in August 2021 and led by Johns Hopkins University researchers found PFAS substances in 39 out of more than 100 different brands of bottled water tested. Since I know that his partner is very risk averse, I advised an interim measure: use a low-cost activated carbon filtration system. A couple of years back the Environmental Working Group tested home systems for effectiveness at removing PFAS. Once more, they were limited by the then-available test methods in knowing how much PFAS was removed. The systems EWG recommends were inexpensive and 100% effective at removing PFAS to the limit of analysis.

So, you can just go to the EWG website, click through to one of the less expensive water filters, and give it a try. After a reliable test method and verification of removal become available, you can reevaluate this decision.

Wednesday, November 8, 2023

Groundwater an Essential Part of our Water Supply

Groundwater is the moisture and water that exists in the spaces between rocks, the pores in the soil, and fractures in the geology: the invisible portion of the water cycle. Groundwater is renewed through precipitation, which is often seasonal, but can be extracted year-round. Provided that there is adequate replenishment, and that the source is protected from pollution, groundwater can be extracted indefinitely and can be robust in the face of drought.

However, mankind is rarely prudent. Increase the amount of groundwater extracted, and slowly over time the aquifer is used up. Development adds people and industry, increasing the demand for water, while adding roads and buildings that prevent the infiltration of precipitation into the ground. Essentially, this reduces the recharge of the aquifer while increasing the demand for water, a potentially unsustainable combination. Increase water use, or reduce recharge by eliminating forested areas and replacing them with compacted soils (lawns that need to be watered), pavement, and buildings, and over time the aquifer will become exhausted.

Groundwater is both used for water supply and serves to support stream flow between rain storms. Groundwater comes from rainwater and snowmelt percolating into the ground. Typically, the deeper the well, the farther away the water originated and the older the water. The groundwater age is a function of local geology, the amount of precipitation and the rate at which water is pumped out of the aquifer. Geology also determines the ease with which water and contaminants can travel through an aquifer and the amount of water the land can hold. The land surface through which groundwater is recharged must remain open and uncontaminated to maintain the quality and quantity of groundwater.

The groundwater cycle in humid and arid regions differs fundamentally. In humid climates, with high rainfall, large volumes of water seep into the groundwater, which contributes actively to the water cycle, feeding streams, springs and wetlands during periods when the rainfall is lower. In semi-arid and arid climates, there is by contrast practically no exchange between surface water and groundwater, because the small volume of seepage from the occasional rainfall only rarely penetrates the thick and dry (unsaturated) soils. That groundwater tends to be much deeper and isolated from surface contact. In currently arid areas groundwater resources are only minimally recharged. Our understanding of the complete water cycle is still only rudimentary.

We do know that groundwater availability varies by location. Precipitation and soil type determine how much the shallower groundwater is recharged annually. However, the volume of water that can be stored is controlled by the reservoir characteristics of the subsurface rocks. Water resources can be used sustainably only if their volume and variation through time are understood. However, such information is often lacking. Hydrology as a science is very young, and so little is known. Any attempt to accurately model the groundwater component of the water cycle requires adequate measurements and observations over decades. The computer models in common use in the United States only address the shallower groundwater and surface water interactions and tend to assume linear relationships, which scientists are finding are not accurate.

Groundwater is usually cleaner than surface water. Groundwater is typically protected against contamination from the surface by the soils and rock layers covering the aquifer. This is the only available clean drinking water in many parts of the world. However, a rising world population, changes in land use and rapid industrialization increasingly place groundwater in jeopardy. Once contaminated, groundwater is very difficult to clean; often, after removal of contaminated plumes, long-term abandonment of use to allow for natural attenuation is the only possible course of action. As droughts and water shortages appear, the value of groundwater has begun to be more fully appreciated. Precious groundwater resources increasingly need to be protected and well managed to allow for sustainable long-term use.

Water-table aquifers are usually shallower than confined aquifers, and because they are shallow, they are impacted by drought conditions and surface contaminants more easily than confined aquifers. Thus, most public supply water wells draw from the deeper confined aquifers. Some of the water is drawn from the fine-grained confining layers called aquitards. Water enters these aquitards very, very slowly, and the danger in utilizing them for supply is that they become overdrawn. When that happens, an irreversible compaction of the fine-grained confining layer occurs and there is permanent subsidence. The land surface falls, permanently reducing the storage capacity of underground aquifers and threatening future water supplies.

The demand for water is rising as population, economic activity and agricultural irrigation grow. We need to manage our water resources in a sustainable way if we are to have a future. To survive over time, a population must live within its available resources. Water is essential for life. We need water for drinking, bathing, irrigated agriculture and industry. A large portion of the fresh water on earth is groundwater. It moves and changes over time, and there are limits to the amount of groundwater available for extraction from an aquifer. To be sustainable, the amount of groundwater removed from an aquifer needs to match the recharge rate.
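The sustainability condition in that last sentence can be illustrated with a toy water balance; all of the numbers below are invented for illustration:

```python
# A toy water-balance sketch of the sustainability point above: if annual extraction
# exceeds recharge, aquifer storage is steadily drawn down. All numbers are invented.
storage = 500.0        # million gallons currently in storage
recharge = 20.0        # million gallons recharged per year
extraction = 26.0      # million gallons pumped per year

years = 0
while storage > 0 and years < 200:
    storage += recharge - extraction
    years += 1
print(f"At this rate the aquifer is exhausted in about {years} years")
# With extraction held at or below recharge, storage never declines.
```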

Sunday, November 5, 2023

HEPA Filters Reduce the Incidence of Covid

 


I grew up in a time before widespread central air conditioning, when windows opened and buildings leaked like sieves. In recent decades, building managers have perfected sealing leaks and minimizing the number of air changes per hour. First there was sick building syndrome; then SARS-CoV-2, the virus that causes COVID-19, arrived and we remembered why we might want fresh air. Ensuring proper ventilation with outside air can help reduce indoor airborne contaminants, including COVID-19 and other viruses. Also, HEPA air filters are good for removing wildfire smoke, and we should consider investing in one for our homes.

In a study recently reported at the World Health Organization Europe Indoor Air Conference, researcher Catherine Noakes reported on a University of Leeds study in which thirty primary schools, balanced for school type (building; ethnicity; free school meals; total student numbers), were randomly allocated to three groups: control, HEPA air cleaning technologies, and UVC air cleaning technologies. All schools were predominantly naturally ventilated and relied on manual opening of windows and doors to ventilate classrooms. All classrooms were equipped with air quality monitors.

The researchers found that air quality data indicated comparable ventilation rates between groups, and a 48% mean reduction in particulate matter in the HEPA classrooms, which were equipped with a free-standing HEPA blower filter about the size of a 20-gallon kitchen trash can. The HEPA filters in classrooms were found to reduce the number of COVID-19-related sick days by more than 20%.

Only absences related to COVID-19 were tracked, but the researchers also believe that the HEPA filters probably cut other respiratory illnesses like colds and flu. Air filters can cost several hundred dollars and can be noisy, but they really produce results. Results from other studies of HEPA air filters, including one in a hospital in Cambridge, UK, are expected in the near future.

Wednesday, November 1, 2023

The Carbon Credit Project in Zimbabwe Collapses

Carbon offsets allow firms and individuals to pay to offset or compensate for the carbon emissions they create: the carbon footprint of airplane flights, a building project, or data centers is notionally eliminated by paying to pull carbon out of the air elsewhere. Voluntary offsets have developed into a billion-dollar global market. Now it appears that the carbon credits may just be greenwashing.

Until 2011 most carbon offset projects focused on building out renewable energy and addressing sources of methane to reduce greenhouse gases. This was because land-based projects, such as those involving agriculture and forestry, had been excluded from the European Union emission trading system under the Kyoto Protocol. As a result, almost no forestry projects were developed. However, in 2011 preventing deforestation became more desirable on the world stage, and it became possible to sell these types of carbon offsets.

The first and largest project was Kariba REDD+, a forest conservation project aimed at providing sustainable livelihood opportunities for poor communities in northern Zimbabwe while locals maintained and cared for the forest. Over the next decade, Kariba’s REDD+ carbon credits were the basis for its corporate clients’ claims of breakthrough progress on cutting emissions. The project has generated $100 million by selling credits for more than 23 million tons of greenhouse gas emissions. However, during the past year serious questions have been raised.

Earlier this year news reports began appearing in Europe and in Bloomberg Green charging that the project had overestimated its climate benefits by at least a factor of five while delivering much less money than indicated to communities in Zimbabwe. Last month, a report in The New Yorker (“The Great Cash-for-Carbon Hustle”) raised concerns about the illegal movement of money and claimed that the carbon reductions were not real. Based on these reports, the Washington, D.C.-based certification body Verra, the world’s leading carbon standard setter for the offsets market, announced it had launched an investigation into the Kariba project. Verra has said that the project will remain on hold, along with “any further credit issuances,” until the probe is complete.

South Pole, the world’s leading seller of carbon offsets, has terminated its involvement in its flagship forest protection project in Zimbabwe following these recent allegations, and the entire carbon offset market is in turmoil. The collapse of Kariba REDD+ could also endanger the viability of the rest of the carbon market, which has slowed this year amid quality concerns, regulatory investigations in the United States and accusations of greenwashing by the United Nations. These problems and weather risks are also undermining the market’s underlying insurance mechanisms, known in the industry as the credit buffer pool. Climate change is bringing increased risks to natural landscapes from drought, wildfires, and invasive insects and species, and some fear these insurance mechanisms are under-capitalized.

A nonprofit, CarbonPlan, voiced concern as early as 2020 that forest fires could easily burn through the buffer pool in California’s carbon market, and wildfires this past summer have damaged a carbon offset project. Researchers at the University of California at Berkeley estimate that forest offset projects underestimate the risk from natural phenomena by a factor of 10.