Wednesday, December 28, 2022

Power, War and CO2 Emissions

2022 will prove to be a year when the world as we knew it shifted and changed. Putin’s unprovoked invasion of Ukraine was in part responsible for our changed views. As Putin has turned the war in Ukraine into mass attacks on civilians, using missiles and drones armed with high explosives to target cities and power infrastructure, it appears he has pulled out the old playbook: terrorize the population so they lose the will to fight and surrender. Instead, this war has strengthened our NATO alliance. It has become clear that our NATO partners are our friends. China, Russia and Iran are not.

Ukraine’s air defenses are weak, and Ukraine cannot retaliate in kind. Putin thinks strategic air attacks are his winning card. Ukrainian President Volodymyr Zelenskyy arrived in the United States this month with the goal of improving Ukraine’s air defenses. The United States needs to stop worrying about provoking Putin and worry about helping Ukraine stop him.

We have also learned over and over that China is not our friend; it is eyeing expansion through invasion and trying to prove it is the rightful first among nations. We must rebuild our waning military and not allow it to suffer the same decay as Russia’s. Standing with our friends, arming and training them ahead of invasion, is a good plan, too. The surveillance state and thought police are bad. We need to hear all voices or be caught unaware.

The war in Ukraine has also changed how we look at power, dependence on other nations, infrastructure and our power generation portfolio. Though the United States, the European Union and others have imposed economic sanctions on Russia and have announced plans to wean themselves off that country’s fossil fuels, Russia still supplies 40% of the natural gas for the European Union. As energy prices soared worldwide, the United States went begging to other nations to produce more oil and gas. The only nation we did not encourage to produce more oil and gas was our own.

As Alaska Senator Dan Sullivan tweeted: “This is national security suicide. [The administration] shuts down energy production in America—particularly in Alaska—then goes on bended knee to dictators in countries like Iran, Saudi Arabia & Venezuela, begging them to produce more energy.”

Our leaders discussed a windfall profits tax on energy companies and drew down 180 million barrels of our Strategic Petroleum Reserve. We need to look at our own energy portfolio and power use. Sustainability and reliability are as important as decarbonizing electric generation in the United States. Extreme temperatures and prolonged severe weather conditions are increasingly impacting our power systems.

Extreme weather increases electricity demand while forcing generation and other resources off-line. At the same time, preference is given to the use of natural gas for heating rather than electricity generation. While PJM has sufficient capacity to meet resource adequacy requirements, it may not have sufficient availability of resources during extreme and prolonged weather events, as we almost discovered Christmas weekend. Peak electricity demand is increasing, and forecasting demand and its response to extreme temperatures and abnormal weather is increasingly uncertain. Specifically, electrification of residential heating requires the system to serve especially high demand on especially cold days.

Over Christmas weekend, PJM and Dominion Power asked the public in their region to conserve electricity, suggesting that electricity customers take simple conservation steps such as:

  •  Setting thermostats lower than usual, if health permits.
  • Postponing use of major electric appliances such as stoves, dishwashers and clothes dryers until other times, and
  • Turning off non-essential electric lights, equipment and appliances.

The call for conservation was prompted by the inadequacy of our power supply to meet the demands of region-wide frigid weather. We are a nation blessed with a plethora of natural resources. If we cannot heat and power our homes and ovens on a very frigid Christmas Eve, we are a declining nation.

Here at home the energy needs of the Commonwealth, its businesses and its families are changing and growing. Virginia is already the data center capital of the world, and the industry is exploding along with the demand for the 24-hours-a-day, 7-days-a-week power needed to run data centers. At the same time, Virginia has been on a short timeline to decarbonize the grid and electrify transportation and heating. When the wind does not blow, the sun does not shine, or the polar vortex arrives, our grid still needs to power our lives.

Virginia has been outsourcing reliable baseload capacity to other states within PJM, increasing Virginia’s dependence on electricity imports from West Virginia and Pennsylvania. As a result, the supply and transmission of energy to Virginia homes and businesses could become less reliable than it is today, and we could end up with rolling blackouts during both heat spells and cold snaps if we do not plan better.

According to Governor Glenn Youngkin, a growing Virginia must have reliable, affordable and clean energy for Virginia’s families and businesses. The Virginia Energy Plan must meet the power demands of a growing economy and ensure Virginia has that reliable, affordable, clean and growing supply of power by embracing an all-of-the-above energy plan that includes natural gas, nuclear, renewables and the exploration of emerging sources to satisfy the growing needs of Commonwealth residents and businesses.

Sunday, December 25, 2022

Freshwater Salinization Syndrome

Freshwater contains natural salts and minerals. However, dramatic increases in salt concentrations are occurring due to human activities including road salt application, water softening, mining and oil production, commercial and industrial processes, weathering of concrete, sea level rise, and fertilizer application. Over the last several decades, scientists have measured increases in salt concentration in several rivers around urban areas including the Potomac River and the Occoquan Reservoir. The salinity in the reservoir has been rising over time and may be reaching a critical stage. 

Many different types of salts contribute to freshwater salinization including sodium, chloride, potassium, calcium, and magnesium. Too much salt in freshwater can harm aquatic life, but there's more to the problem than that. Increased salt concentrations lead to a phenomenon called freshwater salinization syndrome (FSS). This syndrome is due to direct and indirect effects of salts that cause other pollutants in soil, groundwater, surface water, and water pipes to become more concentrated and mobile.

With rising salt levels come rising chloride concentrations; many of the salts involved are chlorides that dissolve in water to release free chloride ions. One example of these effects is that salts can increase the rate at which metals react and mobilize from soils and pipes, and can cause the breakdown of infrastructure; this process is called galvanic corrosion. The rising salinity is also associated in some areas with changing water chemistry: sulfate levels are decreasing and alkalinity is rising. These are other factors that influence corrosion in our infrastructure.

Excess nutrients in the soil like nitrate-nitrogen can also be mobilized by high salinity, thereby exacerbating nutrient pollution, which contributes to the increasing presence of dead zones and/or harmful algal blooms. Radioactive materials such as radium and uranium, naturally occurring in Virginia soils, can also be mobilized and become more concentrated in groundwater and surface water. Excess salts can make water undrinkable, increase the cost of treating water, and harm freshwater fish and wildlife.

The Occoquan Reservoir is an important part of our region’s drinking water supply, providing about 40% of the clean drinking water for around 2 million people. Though sodium mass loading to the reservoir is primarily from watershed runoff during wet weather and reclaimed water during dry weather, the sodium concentration in the reclaimed water provided by the Upper Occoquan Service Authority wastewater treatment plant is at present higher than in the outflow from the two watersheds. However, the new Comprehensive Plan recently approved by the Board of County Supervisors will accelerate industrial, commercial and residential development in the Bull Run and Occoquan river watersheds. History has told us that development increases salinity. The massive development on currently open and forested land in Prince William will accelerate the rate of rising salinity. The solutions will not be cheap: desalination for the drinking water supply, a new shortened life cycle for water, wastewater and distribution systems, nitrogen removal to meet the requirements of the Chesapeake Bay TMDL, and so much more.

Wednesday, December 21, 2022

EPA Announces Bipartisan Infrastructure Grants

Over the past couple of weeks there have been a series of announcements by the U.S. Environmental Protection Agency of the selection of dozens of organizations to receive a total of more than $26.7 million in grants funded through President Biden’s Bipartisan Infrastructure Law. These grants were all for environmental job training programs across the country under the Brownfields program. The grants, disbursed through EPA’s Brownfields Jobs Training Program, will recruit, train, and place workers for community revitalization and cleanup projects at brownfield sites.

In the various press releases EPA Deputy Administrator Janet McCabe said “President Biden’s Bipartisan Infrastructure Law is supercharging EPA’s Brownfields Program, which is transforming blighted sites, protecting public health, and creating economic opportunities in more overburdened communities than ever before.”  

This is apparently all being done through job training. The Bipartisan Infrastructure Law allocated more than $1.5 billion to EPA’s Brownfields Program, which seeks to redevelop former industrial and contaminated sites. This historic investment enables EPA to fund the advancement of the environmental curriculum in job training programs that support job creation and community revitalization.

The Brownfields Jobs Training Program also advances President Biden’s Justice40 Initiative, which aims to deliver at least 40% of the benefits of certain government programs to disadvantaged communities. Based on data from the Climate and Economic Justice Screening Tool, approximately 97% of the communities selected to receive funding have proposed projects in historically underserved areas.

Brownfields Jobs Training grants are made to nonprofits, local governments, and other organizations to recruit, train, and place unemployed and under-employed residents of areas affected by the presence of brownfield sites. Their graduates develop the skills needed to secure full-time, sustainable employment in various aspects of hazardous and solid waste management. All good, but not what I imagined infrastructure to be. I had thought that infrastructure money would go to infrastructure: pipes and equipment, all the things that make up our physical systems. Not job training programs.

Our nation’s drinking water infrastructure system is made up of 2.2 million miles of underground pipes that deliver drinking water to millions of people. There are more than 148,000 active drinking water systems in the nation, though just 9% of all community water systems serve 78% of the population- over 257 million people. The rest of the nation is served by small water systems (about 8%) and private wells (about 14% of the population). There is a water main break every two minutes, and an estimated 6 billion gallons of treated water is lost each day to leaks and water main breaks.

Funding for drinking water infrastructure has not kept pace with the growing need to address the aging infrastructure. Despite that growing need, the federal government’s share of capital spending in the water sector fell from 63% in 1977 to 9% in 2017. This is just one tiny corner of the Bipartisan Infrastructure Law, but an example of how money moves around the government into favored programs. I was hoping more of the infrastructure money would go to renewing our long-neglected physical water infrastructure for future generations.

Monday, December 19, 2022

Reducing PWC Carbon Footprint

 

On November 17, 2020, the Prince William County Board of Supervisors adopted Climate Mitigation and Resiliency goals and authorized the creation of a Sustainability Commission. The Commission is charged with advising on potential enhancements to the Community Energy and Sustainability Master Plan (CESMP), which will provide the map for how the county will reach its climate goals, including reducing Prince William County’s CO2 emissions to 50% of 2005 levels by 2030 and reaching net-zero by 2050, and which will also include plans for adaptation to climate change.

At the close of the COP-27 meeting the UN Secretary-General, Mr. Guterres, decried greenwashing: misleading the public to believe that a company or entity is doing more to protect the environment than it is. He called for bringing integrity to net-zero commitments by industry, financial institutions, cities and regions, and for supporting a global, equitable transition to a sustainable future.

“Using bogus ‘net-zero’ pledges to cover up massive fossil fuel expansion is reprehensible. It is rank deception. This toxic cover-up could push our world over the climate cliff. The sham must end.” Mr. Guterres said that net-zero pledges should be accompanied by a plan for how the transition is being made. PW County is engaged in developing such a plan, and the Sustainability Commission will provide feedback on it.

 

Early in 2020, the General Assembly passed the Virginia Clean Economy Act (VCEA), which mandated a goal of 100% zero-carbon energy generation by 2050. This was to be the major tool in achieving the County’s goals. Since 2010, Virginia electricity use has grown by about 40% while carbon intensity has decreased by almost 30%, largely due to switching a significant portion of electric power generation from coal to natural gas.

Under the VCEA, Virginia is legally required to retire all baseload generation, except for the existing nuclear power plants, in favor of intermittent renewable generation. The VCEA as of 2020 would require enough additional solar panels to cover an area the size of Fairfax County (according to Dominion), and it certainly requires technological advancement in power generation and storage. This is not realistic- it is the equivalent of a plan with a line that says "insert miracle here."

  •  In 2020, natural gas accounted for 61% of Virginia's utility-scale electricity net generation, nuclear supplied 29%, renewables, mostly biomass, provided 6%, and coal fueled less than 4%.
  • Virginia’s Bath County Pumped Storage Station, with a net generating capacity of 3,003 megawatts, is the largest hydroelectric pumped storage facility in the world.
  • 21% of the power used in Virginia was generated in other PJM locations (West Virginia and Pennsylvania).

The VCEA is facing challenges that may prevent it from achieving its goals in the stated time frame. According to Virginia Energy, the VCEA requires the Commonwealth to retire its natural gas power plants by 2045 (Dominion) and 2050 (Appalachian Power). These facilities currently comprise 67% of baseload generation as well as 100% of the power plants that meet peak demand. The switch mandated by the VCEA has not been successfully accomplished anywhere in the world, yet. Advances in technology were always necessary to achieve the goals, and those advances have not come fast enough.

According to Virginia Energy, during the foreseeable future, intermittent energy generation cannot meet all of Virginia's energy needs. At this time, solar and wind generation are affordable in many locations, but battery storage systems required to turn these generation sources into dispatchable energy are cost prohibitive. At the same time the extraordinary growth in electricity demand by the exploding number of data centers under development in Virginia requires that the Commonwealth increase the effective base load to meet what is forecast to be a 30% increase in electricity usage by 2040 (estimated by UVA Cooper Center).

To meet Virginians’ round-the-clock energy needs, full compliance with VCEA will require a reliance on other PJM states to produce the baseload generation capacity for the Commonwealth. However, that is not possible. The PJM has required that Dominion provide more generation into the system to meet the growing demand from data centers now and not wait for future technology.


It appears that electrifying the transportation sector (which faces hurdles in accessibility for all) and heating systems countywide will not meet the stated goals and timing of the resolution, given the growth in power demand and the land use changes approved in CPAs and the updated Comprehensive Plan. However, there is hope that Prince William will be able to “bend the curve” if smart decisions are made. The Community Energy and Sustainability Master Plan is being developed and includes adapting the county to impacts from climate change. You will have an opportunity to attend town hall meetings this spring to voice your concerns and ideas and provide feedback in the development of the Community Energy and Sustainability Master Plan. Please participate in the process and help us build a sustainable future for our children.

  


 

Wednesday, December 14, 2022

Air Source Heat Pumps

 


Heat pumps transfer heat from a heat source to a heat sink using mechanical power. Different sources such as water, air and earth can be utilized as the heat source for heat pump cycles. Air source heat pumps have been widely used in the south and mid-Atlantic states for decades. Warmer source air improves the performance of the heat pump cycle. One of the main challenges of using air source heat pumps is the effect of variations in climate and weather conditions on the operation of the heat pump cycle.

Air-source heat pumps take heat from the outside air, run it through a refrigeration cycle to step up the temperature, and deliver warm air to a building in winter; the reverse cycle cools the building in summer. One benefit of air source heat pumps in less extreme climates is that they can provide both heating and cooling with the simple switch of a valve. My friend insists that the air blown into the room is too cold; though you do not get the hot blast of a gas furnace, I don’t experience cold air and think she needs to insulate her ducts.

Though the optimal operating point at maximum efficiency depends on geometric, thermodynamic and physical properties of the system components like the ducting and working fluids, in general the colder it is outside, the less efficient the heating cycle is. For years heat pump design has undergone improvements and changes to improve performance in more extreme weather. Today, there are air source heat pumps available that will operate in more extreme conditions using less energy than the air source heat pumps of the past.
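
Why does efficiency fall as the outside air gets colder? The thermodynamic ceiling (the Carnot limit) on a heat pump’s coefficient of performance shrinks as the gap between indoor and outdoor temperatures widens. Here is a minimal sketch in Python; the assumption that a real unit delivers about 40% of the Carnot limit is my own illustrative guess, not a manufacturer’s figure:

```python
def carnot_heating_cop(indoor_c: float, outdoor_c: float) -> float:
    """Ideal heating COP = T_hot / (T_hot - T_cold), with temperatures in kelvin."""
    t_hot = indoor_c + 273.15
    t_cold = outdoor_c + 273.15
    return t_hot / (t_hot - t_cold)

# Assume a real unit achieves ~40% of the Carnot limit (illustrative assumption).
for outdoor_c in (10, 0, -10, -20):
    cop = 0.40 * carnot_heating_cop(20, outdoor_c)
    print(f"{outdoor_c:>4} C outside -> approximate COP {cop:.1f}")
```

Even with that rough assumption, the trend is the point: at 10 C outside the sketch gives a COP near 12, while at -20 C it falls below 3, which is why cold-climate designs matter.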

Improvements continue. Research is ongoing in Alaska, Montana and Washington state to look at how the efficiency of a heat pump drops in colder temperatures, and how to appropriately size a heat pump for a home. The air-source heat pumps available today can keep your home warm even amid bone-chilling cold, using less energy than other types of heating systems. Or so the researchers say. One of the options to address heat loss in a duct system, or to avoid the cost of installing ducting, is to utilize ductless mini-split heat pumps in a home or addition.

Ten years ago, when I replaced the failed air source heat pump in my house, I altered the size, changed the ducting configuration and materials, insulated the ducts, and bought a much more efficient, variable-speed heat pump. I ended up with a more comfortable home and a lower electric bill. I have been really happy with my current system, but heat pumps generally last only 8-12 years, so I have been looking at the options on the market today.

There’s no official “cold climate” standard for heat pumps yet, but next month the U.S. Department of Energy’s Energy Star program will introduce a cold-climate standard for air-source heat pumps, including a certification for units with an “acceptable” level of low-temperature performance and efficiency.

For now, you will have to depend on the Energy Guide label, which displays the heat pump's heating and cooling efficiency performance ratings. Every residential heat pump sold in the United States has an Energy Guide label; look for it.

Heating efficiency for air-source electric heat pumps is indicated by the heating season performance factor (HSPF), which is a measure over an average heating season of the total heat provided to the conditioned space, expressed in Btu, divided by the total electrical energy consumed by the heat pump system, expressed in watt-hours. This is measured under set environmental conditions that will not occur outside your house on a regular basis; it is only a measurement for comparison, like miles per gallon. Understand it as a measurement tool.

Cooling efficiency is measured by the seasonal energy efficiency ratio (SEER), which is a measure over an average cooling season of the total heat removed from the conditioned space, expressed in Btu, divided by the total electrical energy consumed by the heat pump, expressed in watt-hours. Once more, this is measured under set environmental conditions.
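
Since both ratings are just seasonal Btu divided by seasonal watt-hours, a short worked example may make them concrete. All numbers below are invented for illustration, not taken from any real unit’s label:

```python
def hspf(seasonal_heat_btu: float, seasonal_electricity_wh: float) -> float:
    """Heating Season Performance Factor: Btu of heat delivered per Wh consumed."""
    return seasonal_heat_btu / seasonal_electricity_wh

def seer(seasonal_cooling_btu: float, seasonal_electricity_wh: float) -> float:
    """Seasonal Energy Efficiency Ratio: Btu of heat removed per Wh consumed."""
    return seasonal_cooling_btu / seasonal_electricity_wh

# Invented season: 60 million Btu of heating on 6,000 kWh of electricity,
# and 24 million Btu of cooling on 1,600 kWh.
print(hspf(60_000_000, 6_000_000))   # 10.0
print(seer(24_000_000, 1_600_000))   # 15.0
```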

The higher the HSPF, the more effectively the heat pump heats on cold days; the higher the SEER, the more effectively it cools on hot days. In general, the higher the HSPF and SEER, the higher the cost of the unit. However, the energy savings can return the higher initial investment over the heat pump's life. As I found when I replaced my first heat pump, the new heat pump used less energy, substantially reducing air-conditioning and heating costs, while still cooling my house on the hottest days and keeping it warm in the winter. In our 4-season climate, getting the highest SEER and HSPF really paid off in overall satisfaction and operating cost savings.

Sunday, December 11, 2022

Gainesville West Data Center

The former Atlantic Research Corporation Superfund site (VAD023741705) at 5945 Wellington Road, Gainesville, VA is being redeveloped into a series of data centers by Amazon. The site is 117 acres in the Data Center Overlay District. Amazon has applied for a Special Use Permit to increase the height of the buildings to 100 feet from the 75 feet allowed by right. Currently, the site is being graded for construction.

As part of the redevelopment plan the on-site stream is being reconfigured and moved. The planning package contains no permits for moving the stream and no studies of whether the site development and stream relocation will impact the stability of the site's institutional controls or create a pathway of exposure for the contaminants known to remain in the groundwater. In addition, the Environmental Constraints Analysis has been waived by the county, along with the Perennial Flow Analysis for the on-site stream.

On June 14, 2018, the U.S. EPA indicated that ARC had satisfactorily completed corrective actions pursuant to the CMI Order, Sections VI.H7 and I8. In addition, Atlantic Research Corporation had taken Interim Measures to address perchlorate and other constituents of concern in soils. However, EPA approved leaving contaminants in soil at the site above residential risk management levels, and at concentrations above risk-based standards in groundwater. To ensure that the contaminants left on site would not injure the public or leave the site, the regulators required that the final facility-wide remedy include ongoing groundwater monitoring and Institutional Controls (ICs), including:

  • A groundwater use restriction on the entire Facility,
  • Soil use restrictions on certain areas of the western parcel, included as part of the IC plan, and
  • Vapor intrusion controls for a specific area, shown on a figure in the IC plan, where groundwater exceeds vapor intrusion screening levels (VISLs).

According to the U.S. EPA website, Atlantic Research Corporation continues to operate and manage a groundwater extraction and treatment system (Northern Deep Treatment System) with performance monitoring at on-site monitoring well locations, including monitoring wells located at the downgradient property boundary. The groundwater extraction system includes approximately 53 monitoring wells that are sampled semi-annually or annually.

Constituents of concern (COCs) currently meet risk-based cleanup standards at the downgradient property boundary and there has been no off-site groundwater or surface water contaminant migration detected.

Pollution of surface water can degrade groundwater quality and, conversely, pollution of groundwater can degrade surface water. Thus, effective land and water management requires a clear understanding of the linkages between groundwater and surface water in any given hydrologic setting. Within Prince William County, Virginia there are four distinct geologic provinces: (1) the Blue Ridge, (2) the Culpeper Basin, (3) the Piedmont, and (4) the Coastal Plain. The U.S. Geological Survey divides the four geologic provinces of the county into seven hydrogeologic groups, labeled A, B, B1, C, D, E and F, based on the presence and movement of the groundwater.

It appears that in the area of this particular site, hydrogeologic group B and C are present.

Hydrogeologic group B underlies the western part of Prince William County and consists of sedimentary rocks of the Culpeper Basin. The predominant rock types are conglomerates, sandstones, siltstones, shales, and argillaceous limestones. Rocks within hydrogeologic group B tend to have moderate to excellent water-bearing potential because they form a fractured rock system with very little overburden. The highest reported well yields in the county are from wells located in this hydrogeologic group, but the group is susceptible to contamination: the fractures that carry water can easily spread a contaminant, and without adequate overburden spills can flow to depth through a fracture.

Hydrogeologic group C, which is interspersed throughout the area of groups B and B1 in the western part of the County, consists of igneous rocks (basalt and diabase) of the Culpeper Basin. The rocks of group C are Early Jurassic in age. The predominant rock types are basalt, sandstone, siltstone, diabase, hornfels, and granofels. Rocks within hydrogeologic group C tend to have generally poor water-bearing potential because of the wide spacing between fractures, mineralization of fractures, and random fracture orientations.

I am concerned that development of the site and relocation and reconfiguration of the stream could potentially mobilize the chemical constituents of concern known to remain on-site:

Tetrachloroethylene (PCE), 1,1-Dichloroethene (1,1-DCE), Methylene Chloride, 1,1,1-Trichloroethane, 1,1,2-Trichloroethane, 1,1-Dichloroethane (1,1-DCA), 1,2,3-Trichloropropane, 1,2-Dichloroethane (1,2-DCA), Benzene, Carbon Tetrachloride,  Chloroethane, cis-1,2-Dichloroethene, Trichloroethene (TCE), Vinyl Chloride (VC), Perchlorate and 1,4-Dioxane.  The following COCs exceed vapor intrusion screening levels: PCE, TCE, VC, Benzene, 1,1-DCA, 1,2-DCA, and 1,1-DCE.

The U.S. EPA and VA DEQ maintain responsibility for the site. The specific address/Parcel ID from the special use approval does not have a joint permit application in process or an active VWPP permit. Any surface water impacts from the eventual data center would likely require a Virginia Water Protection Permit (VWPP) and a permit from the US Army Corps of Engineers.

Though such an application is likely being developed, because surface water impacts are ultimately proposed, the applications for the required permits have not been submitted to DEQ/USACE, and the site is being graded without regard for the soil restrictions in the institutional controls; to all appearances the perennial stream and RPA have been wiped out. No RPA hearing was held. This should not happen. The environmental regulations and institutional controls exist to protect the community and essential water resources from exposure.

It is important that the U.S. EPA, VA DEQ and USACE (who maintain responsibility for the site) approve the planned changes to the site that may impact the institutional controls and hydrology, and determine whether additional monitoring is necessary to ensure that the COCs are not mobilized to leave the site, before excavation and site work begin. Not after construction. The regulations are there to protect us. The county has failed in its duty and is not protecting the interests of the public in waiving the Environmental Constraints Analysis and the RPA disturbance hearing, and in granting the excavation permit ahead of the VWP permit.

Wednesday, December 7, 2022

2022 Dead Zone Update

Overall, the total volume of the 2022 Dead Zone in the Chesapeake Bay was the second lowest since 1985. The “Dead Zone” of the Chesapeake Bay is the common name given to the volume of hypoxic water- water with dissolved oxygen concentrations less than 2 mg/L, too low for aquatic organisms such as fish and blue crabs to thrive.
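
To see how that 2 mg/L cutoff turns monitoring data into a Dead Zone volume, here is a toy sketch; the sampling-cell volumes and oxygen readings are invented, and the real monitoring program interpolates between stations rather than simply summing cells:

```python
HYPOXIA_THRESHOLD_MG_L = 2.0

# (dissolved oxygen in mg/L, water volume in cubic km) for each sampled cell.
# All values invented for illustration.
cells = [(6.8, 1.2), (1.5, 0.4), (0.9, 0.3), (3.2, 0.8), (1.9, 0.5)]

hypoxic_volume = sum(volume for do, volume in cells if do < HYPOXIA_THRESHOLD_MG_L)
print(f"Hypoxic volume: {hypoxic_volume:.1f} cubic km")  # 1.2 cubic km
```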

Hypoxia in 2022 started relatively late in the season. Around the beginning of June, the Dead Zone started to spread through the mainstem of the Bay. Although the size of the Dead Zone increased from June through July, the volume of hypoxic water stayed relatively low compared to the historical volumes and the mid-summer peak for the Dead Zone was slightly less than the historical average. The Dead Zone quickly decreased following the mid-summer peak and was effectively ended by the passing of the remnants of Hurricane Ian around the beginning of October. Overall, 2022 was a relatively good (low amount) year for hypoxia in Chesapeake Bay.

In the end, the duration of hypoxia in summer 2022 was short and the total annual amount of hypoxia was relatively low. This was a relatively good year for hypoxic conditions in the Bay. In years like this it feels as if the Chesapeake Bay Clean Water Blueprint is making progress.




Sunday, December 4, 2022

Beavers Improve Water Quality

As climate change worsens water quality and threatens ecosystems, the wooden dams of beavers may help lessen the damage. This was the finding of a recent study by Stanford University researchers. Published November 8, 2022 in Nature Communications, the study found that the dams built by beavers raise water levels upstream, diverting water into surrounding soils and secondary waterways- the riparian zone. These zones act like filters, straining out excess nutrients and contaminants before water re-enters the main channel downstream.

The hotter, more arid conditions expected from climate change will worsen water quality. However, these same conditions are favorable to the American beaver, and may have contributed to a resurgence of the population and an explosion of dam building in the western United States.

The discovery of the profound impact of beaver dams came about serendipitously. As a PhD student in 2017, lead study author Christian Dewey had started doing field work along the East River, a main tributary of the Colorado River near Crested Butte in central Colorado. Initially, Dewey had set out to track seasonal changes in hydrology, and riparian zone impacts on nutrients and contaminants, in a mountainous watershed.

“Completely by luck, a beaver decided to build a dam at our study site,” said Dewey, who is now a postdoctoral scholar at Oregon State University. “The construction of this beaver dam afforded us the opportunity to run a great natural experiment.”

Beavers are semiaquatic mammals partial to freshwater environments. They create their own ecological niche by building dams, and dam construction can alter the hydrology, biogeochemistry, and ecosystems of river corridors. Beavers build dams to engineer their habitat for food supply (riparian and wetland vegetation), to create water bodies deep enough that they do not completely freeze during winter in colder locations, and for protection from potential predators. The dams and their ponds create riparian discontinuities that allow a river to cleanse its waters and allow the created wetlands to absorb excess precipitation, preventing catastrophic flooding.

To understand how beaver dams may affect water quality in a future where global warming produces more frequent droughts and extreme swings in rainfall, the Stanford researchers compared water quality along a stretch of the East River during a historically dry year, 2018, to water quality the following year, when water levels were unusually high. They also compared these yearlong datasets to water quality during the nearly three-month period, starting in late July 2018, when the beaver dam blocked the river.

Water quality is a measure of the suitability of water for a particular purpose – ecosystem health or human consumption, for instance. During periods of drought, as less water flows through rivers and streams, the concentrations of contaminants and excess nutrients, such as nitrogen, rise. Major downpours and seasonal snowmelt are then needed to flush out contaminants and restore water quality.

The researchers found that the beaver dam dramatically increased removal of nitrate, a form of nitrogen, by creating a surprisingly steep drop between the water levels above and below the dam. The larger the gradient, the greater the flow of water and nitrate into soils, where microbes transform nitrate into an innocuous gas.
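
The mechanism is essentially Darcy's law: the steeper the head drop across the dam, the larger the flux of water, and of the nitrate it carries, through the riparian soil where denitrifying microbes can work on it. A minimal sketch, with every parameter value invented for illustration:

```python
def darcy_flux(hydraulic_conductivity_m_s: float,
               head_drop_m: float,
               flow_path_length_m: float) -> float:
    """Darcy flux q = K * (dh / L), in meters per second."""
    return hydraulic_conductivity_m_s * head_drop_m / flow_path_length_m

# Invented values: sandy riparian soil (K = 1e-4 m/s), a 0.5 m head drop
# across a 10 m flow path through the bank.
q = darcy_flux(1e-4, 0.5, 10.0)
print(f"Darcy flux: {q:.1e} m/s")  # 5.0e-06 m/s; doubling the head drop doubles the flux
```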




For further reading see:

Dewey, C., Fox, P.M., Bouskill, N.J. et al. Beaver dams overshadow climate extremes in controlling riparian hydrology and water quality. Nat Commun 13, 6509 (2022). https://doi.org/10.1038/s41467-022-34022-0

Dam builders and their works: Beaver influences on the structure and function of river corridor hydrology, geomorphology, biogeochemistry and ecosystems (ScienceDirect)


Wednesday, November 30, 2022

New Rule Proposes that Federal Contractors Disclose GHG Emissions

On November 10th the White House announced that the Administration is proposing a new rule- the Federal Supplier Climate Risks and Resilience Rule. This rule would require major Federal contractors to publicly disclose their greenhouse gas emissions and climate-related financial risks and set science-based emissions reduction targets.

Under the proposed rule, suppliers and contractors to the Federal government receiving more than $50 million in annual contracts would be required to publicly disclose Scope 1, Scope 2, and relevant categories of Scope 3 emissions, disclose climate-related financial risks, and set science-based emissions reduction targets. Smaller Federal suppliers and contractors with more than $7.5 million but less than $50 million in annual contracts would be required to report Scope 1 and Scope 2 emissions. All Federal contractors with less than $7.5 million in annual contracts would be exempt from the rule.
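
The tiering logic of the proposed rule is simple to state in code. A sketch using the dollar thresholds above; the function name and category labels are mine, not the rule's:

```python
def disclosure_tier(annual_contracts_usd: float) -> str:
    """Classify a federal contractor under the proposed rule's disclosure tiers."""
    if annual_contracts_usd > 50_000_000:
        return ("major contractor: Scope 1, Scope 2, and relevant Scope 3 emissions; "
                "climate-related financial risk disclosure; science-based targets")
    if annual_contracts_usd > 7_500_000:
        return "smaller contractor: Scope 1 and Scope 2 emissions only"
    return "exempt from the rule"

print(disclosure_tier(60_000_000))
print(disclosure_tier(10_000_000))
print(disclosure_tier(1_000_000))
```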

According to National Grid, Scope 1 and 2 emissions are the direct and indirect emissions from sources owned or controlled by an organization, whereas Scope 3 emissions are a consequence of the activities of the company but occur from sources not owned or controlled by it. An example of Scope 1 emissions would be burning fuel in a company’s fleet of vehicles (if they’re not electrically powered) or heating a building with a gas furnace.

Scope 2 emissions are emissions that a company causes indirectly when the energy it purchases and uses is produced- for example, the emissions from generating the electricity used to power its buildings, electric cars and trucks, and other electric-powered equipment.

Scope 3 emissions encompass emissions that are not produced by the company itself and are not the result of activities from assets it owns or controls, but for which it is indirectly responsible, up and down its value chain. An example of this is when the company's customers buy, use and dispose of its products, or when it purchases products from suppliers.

Each major contractor (those with more than $50 million in contracts) would have to publish an annual climate disclosure report that would include a qualitative disclosure of climate-related risks. In addition, each major contractor would also be required to publish science-based targets to reduce GHG emissions in line with what the latest science deems necessary to meet the goals of the Paris Agreement, as validated by a third party. It is unclear how a federal supplier of $50 million in goods or services is going to mitigate the GHG emissions in the grid, other than by buying carbon credits, or what the costs associated with compliance will be.

As UN Secretary-General Guterres recently pointed out, the criteria for net-zero commitments can have loopholes wide enough to “drive a diesel truck through.” At the recently ended COP-27 meeting Mr. Guterres said that net-zero pledges should be accompanied by a plan for how the transition is being made: “Management must be accountable for delivering on these pledges.”

The Federal Acquisition Regulatory Council, which consists of the Department of Defense, the General Services Administration, the National Aeronautics and Space Administration, and is chaired by the Office of Federal Procurement Policy in the Office of Management and Budget, has issued this proposed rule under the Federal Acquisition Regulation (FAR). The FAR is the primary regulation for use by all executive agencies in their acquisition of supplies and services with appropriated funds.

Let's look at the Scope 1, 2 and 3 emissions disclosure for Amazon for the period 2019-2021. If you recall, Amazon was a co-creator of The Climate Pledge, a commitment to reach net-zero carbon by 2040, 10 years ahead of the Paris Agreement. Since they created the Pledge in 2019, more than 300 companies have joined Amazon in making this commitment. Amazon appears to have no "science based" plan for meeting their commitment.

With all of the growth that Amazon has chased, and despite purchasing carbon offsets and solar farms, Amazon's carbon emissions in 2021 were 40% greater than they were in 2019, though their carbon intensity decreased by 18% over the same period. Carbon intensity quantifies total carbon emissions, in grams of carbon dioxide equivalent (CO₂e), per dollar of gross merchandise sales. Amazon has not released a plan for how they will meet their climate pledge. Rather, Amazon states: “As companies invest in new products and services, and their businesses grow substantially, the focus should not be solely on a company’s carbon footprint in terms of absolute carbon emissions, but also on whether it’s lowering its carbon intensity.”
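
Taken together, those two figures also tell you roughly how fast sales grew. Since carbon intensity is emissions divided by sales, a quick back-of-the-envelope check (my arithmetic, not a reported Amazon number):

```python
emissions_ratio = 1.40   # 2021 emissions vs. 2019 (+40%)
intensity_ratio = 0.82   # 2021 carbon intensity vs. 2019 (-18%)

# intensity = emissions / sales, so sales = emissions / intensity
sales_ratio = emissions_ratio / intensity_ratio
print(f"Implied sales growth: {(sales_ratio - 1) * 100:.0f}%")  # about 71%
```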




Sunday, November 27, 2022

Solar Adoption and Solar Incentives

When a new consumer technology makes its debut, its adoption rate typically follows a predictable path. The first buyers come from a narrow slice of high-income users or tech enthusiasts who are willing to pay high prices. Over time, as prices fall and economies of scale kick in, sales climb sharply and the technologies become mass-market products. Eventually, the market becomes saturated, and the number of users reaches a plateau.

This pattern of adoption is what was expected for rooftop solar photovoltaic arrays, but they have not performed that way. In a recent study published in Joule and led by Zhecheng Wang, a doctoral student in Stanford's Department of Civil and Environmental Engineering, researchers examined the adoption of solar photovoltaic panels in the United States.

Previously, Stanford researchers had analyzed the number of solar installations at a single point in time. That work showed that solar arrays were much less common in low-income communities, but it didn't offer much insight into the pattern of adoption of this technology. From a public policy point of view it is important to understand that pattern.

To investigate why, the Stanford researchers developed a computer model that interprets the low image resolution of older satellite imagery, enabling them to identify the installation year of PV arrays from historical aerial and satellite images. The model, which they named DeepSolar++, analyzed satellite images to identify where solar panels are and when they were installed in more than 400 counties across the United States. The researchers compiled images from 2006 through 2017 (a narrow time span for the adoption of an expensive, limited-gratification and/or status technology) and then combined that data with information about each community's demographics as well as local financial incentives for solar power.

Their analysis showed that low-income communities are not only delayed in their adoption onset but also appear to saturate more quickly, at lower adoption levels. Though the time frame of the study may simply be too narrow to capture the full life cycle of solar adoption, the researchers assumed it wasn't and examined the correlation of adoption with financial incentives.

Federal, state and local governments have long offered financial incentives, often in the form of rebates on income or property taxes. Performance-based incentives are much rarer. The Stanford researchers used a federal database of state incentives for renewable energy to identify which kinds of incentives were available in each community. This overlooked some local incentives; nonetheless, they found that only upper-income communities seemed to respond to tax incentives.

Their analysis of financial incentives offered on a state-by-state basis found that performance-based incentives (which reward customers based on how much solar they produce or how much less electricity they buy from the grid) are positively associated with saturated adoption levels in lower-income communities. Causality was not shown, only a correlation, and they were unsure why performance-based incentives seem effective among lower-income communities.


The researchers pointed out that lower-income families have much lower taxes and thus benefit less from both property and income tax breaks. People who rent rather than own their homes pay no property taxes at all. The lead author, Ram Rajagopal, speculates that the less common performance-based type of incentive may motivate the owners of apartment buildings. Without financial incentives, installing solar photovoltaic panels to generate electricity is still more expensive than buying electricity from the grid. If you do not fully benefit from financial incentives, you do not install solar. This is simply rational behavior.

As Investopedia pointed out in their excellent analysis of the costs and benefits of solar power, it is capital intensive and the main cost of owning a system comes upfront when buying the equipment. The money has to be paid upfront for design, permits, solar panels, inverters, wiring, installation etc. In addition to installation costs, there are operating and maintenance costs for a photovoltaic solar array. Aside from cleaning the panels regularly, inverters generally need replacement after several years of use, rack systems fail (especially in snow which can lift up the racks) and roofs leak especially after mounting racks on them. In addition, there is great variability in solar production potential based on location and orientation of the roof and shading.


The cold, hard numbers of my solar system

The financial benefits of solar are limited to the long-term cost savings generated after the system pays for itself and before the roof needs to be replaced. This can be a short window of time, and it requires that the homeowner live in the house for about 15 years to benefit fully. There are limited emotional benefits to owning solar, and they are mostly private with limited appeal. Unlike driving a Tesla or flashing your iPhone 14, virtually no one sees my solar panels. I think the Stanford researchers, who are clearly interested in public policy, should include a behavioral economist on their team.
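
A simple payback calculation shows why that roughly 15-year window matters. The numbers below are invented for illustration, not my actual system costs, and they ignore panel degradation, maintenance, and electricity price changes:

```python
def simple_payback_years(net_install_cost_usd: float,
                         annual_production_kwh: float,
                         grid_price_usd_per_kwh: float) -> float:
    """Years until cumulative electricity savings equal the upfront cost."""
    annual_savings = annual_production_kwh * grid_price_usd_per_kwh
    return net_install_cost_usd / annual_savings

# Invented example: $18,000 net of incentives, 9,000 kWh/year, $0.13/kWh grid power.
years = simple_payback_years(18_000, 9_000, 0.13)
print(f"Simple payback: {years:.1f} years")  # about 15.4 years
```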

The Stanford study was funded by the U.S. Department of Energy, the National Science Foundation and a Stanford Precourt Pioneering Project award.

Tuesday, November 22, 2022

COP 27 Ends

The 27th Conference of the Parties (COP27) closed its meeting in Sharm el-Sheikh, Egypt with little if any substantial progress. After missing their Friday night deadline, negotiators were able to agree on a commitment to set up a financial support structure for the most vulnerable nations by the next COP in 2023.

Yet, while agreement on these issues was welcomed as a step in the right direction, there appeared to be little forward movement on other key issues, particularly on the phasing out of fossil fuels and tightened language on the need to limit global warming to 1.5 degrees Celsius. In fact, new language added at this meeting that includes “low emissions” energy alongside renewables as the energy sources of the future is a significant loophole. The undefined term could be used to justify new fossil fuel development against the clear guidance of the UN Intergovernmental Panel on Climate Change (IPCC) and the International Energy Agency (IEA).

The truth is the window for limiting global temperature rise to 1.5 degrees Celsius is closing rapidly. CO2 emissions from fuel have continued to grow year after year, with the exceptions of a brief respite during the global financial crisis and the Covid-19 lockdowns. Now, CO2 emissions are expected to resume their climb. Coal plants that were scheduled to shut down have continued to operate, several recently shut down coal-fired turbines have been restarted in Europe, and China continues to build coal plants. Coal-fired electricity generation emits about twice the CO2 of natural gas.

While a growing number of governments and non-State actors are pledging to be carbon-free, the criteria for net-zero commitments can have loopholes wide enough to “drive a diesel truck through,” the UN Secretary-General, Mr. Guterres, warned as his expert group on the matter published its first report.

That report slams greenwashing – misleading the public to believe that a company or entity is doing more to protect the environment than it is, and provides a roadmap to bring integrity to net-zero commitments by industry, financial institutions, cities and regions and to support a global, equitable transition to a sustainable future.

“Using bogus ‘net-zero’ pledges to cover up massive fossil fuel expansion is reprehensible. It is rank deception. This toxic cover-up could push our world over the climate cliff. The sham must end.” Mr. Guterres said that net-zero pledges should be accompanied by a plan for how the transition is being made: “Management must be accountable for delivering on these pledges.”

As a member of the Prince William County Sustainability Commission, I look forward to the development of and implementation of the Community Energy and Sustainability Master Plan (CESMP), which will serve as a roadmap for the county to reach its climate mitigation and resiliency goals. 

Sunday, November 20, 2022

Recycled Water

All the water that ever was or ever will be on earth is here right now and is over 4 billion years old. More than 97% of the Earth’s water is in the oceans. The remaining 2.8% is the fresh water within the land masses.

Of that fresh water, 77% is estimated to be in icecaps and glaciers (melting away as the planet warms), and the remaining fresh water is stored primarily in the subsurface as groundwater. The tiny fraction of a percent that remains is the rivers and lakes that supply the lion’s share of mankind’s needs. Rivers and lakes are repeatedly renewed by rainfall. Rain drops fall to earth and evaporate, infiltrate into the soil, recharge groundwater, or flow along the ground to a stream and ultimately into rivers and to the ocean- always moving.

 As average temperatures at the Earth’s surface rise, more evaporation occurs, which, in turn, should increase overall precipitation. Therefore, a warming climate is expected to increase precipitation in many areas, but not all.

According to NASA: “Current climate models indicate that rising temperatures will intensify the Earth’s water cycle, increasing evaporation. Increased evaporation will result in more frequent and intense storms; but will also contribute to drying over some land areas. As a result, storm-affected areas are likely to experience increases in precipitation and increased risk of flooding, while areas located far away from storm tracks are likely to experience less precipitation and increased risk of drought.”

We are past the point where we can stop or reverse climate change and hope the climate will return to what it had been. Precipitation will not continue to fall in the patterns of the past. According to the U.S. EPA, on average, total annual precipitation has increased over land areas in the United States. However, a few areas, such as the Southwest, have seen a decrease in precipitation.

A mix of growing population, economic growth and changes in precipitation patterns has created a severe water shortage in California that could grow into a crisis in the near future. As drinking water sources become more scarce, California and other states look to directly recycle wastewater to drinking water. Potable reuse systems are up and running around the United States. The Orange County Water District has run the world’s largest water recycling plant since the 1970s. Water providers in Northern Virginia, Atlanta, Georgia, and Aurora, Colorado, also use potable reuse water as part of their drinking water supplies.

Potable reuse, the process of treating wastewater to drinkable standards, offers a reliable and sustainable solution to cities and regions facing shortages of clean water. The city of Los Angeles and water agencies across Southern California are looking into what’s known as “direct potable reuse,” which means putting purified recycled water directly back into our drinking water systems- what was once called “toilet to tap” recycling. This differs from the indirect potable reuse we have here, where water spends time in our Occoquan Reservoir.

Los Angeles plans to recycle all of its wastewater by 2035, and the California State Water Resources Control Board has been tasked by legislators with developing a set of uniform regulations on direct potable reuse by Dec. 31, 2023. Meanwhile, a direct potable reuse demonstration facility near Griffith Park will serve to demonstrate the concept.

To accomplish that, the Hyperion Water Reclamation Plant, which like all wastewater treatment plants currently treats wastewater only to the level necessary for release, must be converted into an advanced water purification facility that produces water clean enough to drink.

To complement these plans, a group at Stanford University has been studying the quality of reclaimed water. They “expected that potable reuse waters would be cleaner, in some cases, than conventional drinking water due to the fact that much more extensive treatment is conducted for them,” said Stanford professor William Mitch of a new study published in Nature Sustainability: Toxicological assessment of potable reuse and conventional drinking waters.

The engineers found that, after treatment, potable reuse water is cleaner than conventional drinking water sourced from pristine-looking rivers. In most rivers, somewhere upstream is wastewater and stormwater releases  which have much less treatment than occurs in potable reuse systems.

Regulators demand more extensive treatment at potable reuse treatment plants. They specify that treatment systems must remove harmful pathogens, such as viruses and amoebas, and utilities flush out other contaminants using reverse osmosis, ozonation, biofiltration, and other cleaning techniques.

Reverse osmosis treatment pushes water at high pressure through a membrane with pores so small that it is also the method used for desalination. Dr. Mitch and his colleagues found that the process produces water as clean as, if not cleaner than, groundwater, the gold standard. Even when reverse osmosis wasn’t applied, reuse waters were less toxic than the samples of conventional drinking water sourced from rivers across the United States.

Conventional wastewater treatment plants just aren’t equipped to deep clean. This leaves many organic contaminants, such as chemicals from shampoos and medicines and trace contaminants from our manufactured products, floating downriver and straight into a drinking water plant. Direct potable reuse will require building water treatment plants that remove all these contaminants. Los Angeles estimates $16 billion to implement these upgrades. However, California is not known for reliable cost estimates for big projects.

Read the full study: Lau, S.S., Bokenkamp, K., Tecza, A. et al. Toxicological assessment of potable reuse and conventional drinking waters. Nat Sustain (2022). https://doi.org/10.1038/s41893-022-00985-7.

Wednesday, November 16, 2022

Cover Crops, no simple solution

Planting cover crops is a key tenet of conservation agriculture that involves planting non-cash crops on agricultural fields to provide soil cover between primary crop growing seasons. Cover crops primarily benefit future crops by reducing soil erosion and nitrogen runoff, crowding out weeds, controlling pests and diseases, increasing biodiversity, and improving soil health by helping to build soil carbon.

Building soil carbon also serves to reduce CO2 in the atmosphere. So cover cropping was well funded under the United States Department of Agriculture's (USDA's) Environmental Quality Incentives Program, which turned all agencies towards climate stewardship and has provided more than $100 million of incentives for cover crop adoption each year since 2016. An additional incentive of reduced insurance premiums was added through the Pandemic Cover Crop Program.

Under these incentives, the total cropland area in the United States planted with cover crops in 2017 was nearly 50% higher than reported in 2012 and has continued rising in the past five years. It sounds impressive until you realize that, overall, only about 5% of cropland used cover crops in 2017.

Cover crops (grasses, legumes and forbs) recommended for seasonal cover and other conservation purposes include annual ryegrass, oilseed radish, winter cereal rye, and oats, used for scavenging unused fertilizer and releasing nutrients back into the soil for the next crop to use. Good cover crops for breaking up compacted soils are forage radish and forage turnip. Similar to commercial nitrogen (N) fertilizers, legume cover crops like crimson clover, hairy vetch and Austrian winter pea can provide some of the nitrogen needs of the primary crop.

Experimental field trials have often found slight yield losses for primary crops. However, these effects appear to vary considerably depending on many factors, including the agricultural region, the combination of cover and primary crop types, weather conditions, and management practices. Results from field trials varied widely based on the type of cover crop, the level of fertilization, and the date of cover crop termination.

A new study from Stanford scientists examines yield loss using data from actual farmer fields. They used satellite data to observe both the adoption of cover cropping and the yields of corn and soybeans throughout six states in the heart of the US Corn Belt. These observations, covering more than 90,000 fields, were then used in an algorithm developed by others to measure the incremental yield impact of adopting cover crops.

Using the satellite data, they could determine the presence or absence of cover crops each year at field-level resolution. They used the previously published Scalable Crop Yield Mapper (SCYM) algorithm to estimate yields; SCYM uses region-specific crop model simulations and weather data to determine yields from satellite pixel data. Because they were using satellite data, their analysis could only represent the yield impacts of cover cropping as practiced in aggregate across the region.

The algorithm results indicated that fields where cover crops were adopted for three or more years experienced an average corn yield loss of 5.5% compared with fields that did not practice cover cropping. The scientists also found that, on average, soybean yields were reduced by 3.5% following cover crop adoption. Nearly all locations appeared to experience negative effects. In general, impacts appeared most negative in Iowa and northern Illinois compared with the rest of the study region; these areas were generally associated with better soil ratings and higher mid-season temperatures.

The scientists found greater yield losses for corn than for soybeans, which they felt was likely due to soybeans' lower need for fertilizer nitrogen. They also found that corn yield impacts were significantly more negative on fields with a high soil productivity index (NCCPI). The scientists reasoned that because those fields have higher yield potential, they also have higher nitrogen needs to meet that potential.

Jay Yankey, former manager of our own Prince William Soil and Water Conservation District and a current farmer, offers these anecdotal observations: “Small yield losses may be seen in certain situations, in certain years, and the longer growers work with integrating covers in their systems the better they get at managing them thus reducing these losses. The other thing they didn't look at was the economics. Going no-till and using covers reduce fuel and fertilizer used. Even though yield may be slightly reduced, profit may actually be better.”

While there is on-the-ground research supporting the numerous benefits of introducing cover crops into a system, there are also challenges that growers may face in implementation and management. Cover cropping works differently in different agricultural systems. Particularly in arid or drought-prone environments, the water needs of cover crops may reduce the amount available to the main crop or require the use of supplemental irrigation.

In addition to potential increases in irrigation, there are other economic costs that must be considered. Expenditures for seed and soil preparations as well as labor requirements will change with the introduction of a cover crop. Because cover crops are left in the field, there is no direct profit to the farmer for harvested crop products. If improperly selected or managed, some cover crops can persist as weeds when the field is transitioned and prepared for subsequent plantings.


Sunday, November 13, 2022

Natural Gas Appliances and Global Warming

For several years the U.S. Department of Energy has been promoting the use of induction for home cooking. Conventional residential cooking tops typically use gas or resistance electric heating elements (the ubiquitous coil) to heat food. The government estimates that gas stoves are approximately 32% efficient in their energy use and electric stoves are 75-80% efficient. Residential induction burners consist of an electromagnetic coil that creates a magnetic field when turned on. Compatible cookware is heated when it is within the magnetic field; according to the DOE, induction cooking is 85% efficient. Less heat is lost to the surrounding air, providing an additional energy efficiency benefit by reducing the workload for air conditioning equipment. A cooler cooking top surface also makes induction cook tops safer to work with than other types of cooking tops. Finally, because the cookware itself is the source of heat, it reaches desired temperatures more quickly and provides faster cook times.
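Those efficiency numbers are easier to feel with a quick back-of-the-envelope calculation. A small Python sketch, using the DOE figures above (with 77.5% as a midpoint for resistance electric):

# Energy input required to deliver 1 kWh of heat into the cookware
efficiency = {"gas": 0.32, "resistance electric": 0.775, "induction": 0.85}
for stove, eff in efficiency.items():
    print(f"{stove}: {1 / eff:.2f} kWh in per 1 kWh of heat out")

A gas burner needs roughly 3.1 kWh of fuel energy to put 1 kWh of heat in the pot, while induction needs only about 1.2 kWh of electricity.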

I had always dreamed of a kitchen with a commercial or commercial-style stove. When I had saved up the money to upgrade my kitchen, I realized that the kitchen-centerpiece stove was not my best choice. First, it is a warming world and those stoves throw off lots of heat; second, I live in a rural area where natural gas (methane) is not available, so instead we have a propane tank; and third, commercial stoves are simply not good at a low simmer, my preferred cooking style. I make lots of sauces, gravy, stews and soups. Gas burners (especially propane with its three carbons) burn too hot. So, when I updated my kitchen in 2018, I installed an induction cook top. I have been amazingly happy with that choice. The cooking is all I had hoped. What I had not anticipated is how easy and fast it is to clean, and that the bad kitty cannot accidentally turn it on.

Now scientists are taking a closer look at cooking with gas. Natural gas is a popular fuel choice for home cooking and has always been considered better than conventional electric; it has the reputation that “real cooks” use natural gas. Nationally, over 40 million homes (about a third) cook with gas. Natural gas appliances release methane and other pollutants through leaks and incomplete combustion. These appliances warm the planet in two ways: generating carbon dioxide by burning natural gas as a fuel and leaking unburned methane into the air. A recent Stanford University study found that natural gas-burning stoves emit up to 1.3% of the gas they use as unburned methane.

According to the U.S. EPA, methane is the second most prevalent greenhouse gas and accounted for about 10% of all U.S. greenhouse gas emissions from human activities. Methane is emitted by natural sources such as wetlands and the breakdown of organic material, as well as from leakage from natural gas systems, growing rice, waste disposal and the raising of livestock. Methane is a powerful greenhouse gas and is 25 times more effective than carbon dioxide at trapping heat over a 100-year period. While it does occur naturally, major human-generated sources include landfills, refineries, oil and gas fields, natural gas infrastructure, dairies and wastewater treatment plants.
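To put the stove leakage figure in perspective, here is a rough, illustrative calculation. The 1.3% leak rate and the global warming potential of 25 come from the text above; the annual gas use and the energy content of methane are assumptions I chose for the example:

THERM_MJ = 105.5       # energy in one therm of natural gas, MJ
CH4_MJ_PER_KG = 55.5   # approximate energy content of methane, MJ/kg
GWP_100 = 25           # 100-year global warming potential of methane

annual_therms = 40     # assumed annual stove gas use (illustrative)
leak_fraction = 0.013  # up to 1.3% of gas used leaks unburned

gas_kg = annual_therms * THERM_MJ / CH4_MJ_PER_KG  # about 76 kg of methane
leak_kg = gas_kg * leak_fraction                   # about 1 kg leaked per year
print(f"Roughly {leak_kg * GWP_100:.0f} kg CO2e per year from stove leaks")

About 25 kg of CO2-equivalent a year from a single stove sounds small, but multiplied across 40 million gas-cooking homes it adds up.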

This work came out of Dr. Jackson’s lab at Stanford University, where they are working to measure and reduce greenhouse gas emissions through the Global Carbon Project (globalcarbonproject.org), which Jackson chairs. Some of their work is directly aimed at measuring and reducing methane emissions from oil and gas wells, city streets, and homes and buildings. According to Dr. Jackson and his colleagues, curbing methane emissions will require reducing fossil fuel use and controlling fugitive emissions such as leaks from pipelines and wells, as well as changes to the way we feed cattle, grow rice and eat. “We’ll need to eat less meat and reduce emissions associated with cattle and rice farming,” Dr. Jackson said, “and replace oil and natural gas in our cars and homes.”

The scientists measured methane and nitrogen oxides released in 53 homes in California, not the biggest of samples. Their sample group included 18 brands of gas cooktops and stoves ranging in age from 3 to 30 years old. Measurements were taken during combustion, ignition, and extinguishment, and also while the appliance was off.

The scientists found no relationship between the age or cost of a stove and its emissions. What they did find was that more than three-quarters of methane emissions occurred while stoves were off, suggesting that gas fittings and connections to the stove and in-home gas lines are responsible for most emissions, regardless of how much the stove is used. They should probably have examined the age of the interior piping and fittings in the homes, but that was not part of the study. California does not require a building permit when you replace gas appliances the way we do here, so the fittings in California are not tested regularly over time.

The scientists found the highest emitters were cooktops that used a pilot light instead of a built-in electronic sparker. Methane emissions from the puffs of gas emitted while igniting and extinguishing a burner were on average equivalent to the amount of unburned methane emitted during about 10 minutes of cooking with the burner.

Larger stoves (those trophy kitchen appliances) tended to emit nitrogen oxides at higher rates. The scientists estimated that people who don’t use their range hoods or who have poor ventilation can surpass the EPA’s guidelines for 1-hour exposure to nitrogen dioxide outdoors (there are no indoor standards) within a few minutes of stove usage, particularly in smaller kitchens.

Dr. Jackson encourages switching to electric stoves to cut greenhouse gas emissions and indoor air pollution. I switched to induction to get fabulous cooking, easy cleanup and energy efficiency. I maintain propane in my home to power my backup generator, a propane furnace, a gas fireplace (I'm thinking about it) and a hot water heater. Without electricity I have no water (my well pump does not work), my air heat pumps do not work, and all my kitchen appliances and freezer go down. We have lost power for several days after a storm in the winter and once in the summer. Because I have the generator and backup systems, my pipes did not burst, my septic pump continued to operate, and life went on.

Wednesday, November 9, 2022

COP27 Opens in Egypt

This week the 27th Conference of the Parties (COP27) opened its meeting in Sharm el-Sheikh, Egypt. The conference will run until November 18, 2022. The prospects for significant progress appear dim.

If you recall, in December 2015 at the 21st Conference of the Parties in Paris, delegates from 196 countries reached an agreement that we all hoped would put the nations on a course to reduce carbon dioxide emissions from the combustion of fossil fuel.

Under the Paris Agreement, every country agreed to work together to limit global warming to “well below 2 degrees” and aim for 1.5 degrees, to adapt to the impacts of a changing climate, and to make money available to deliver on these aims for countries not able to afford the costs of adapting to a changing climate. The parties to the agreement committed to create national plans, called Nationally Determined Contributions (NDCs), setting out how much they would reduce their emissions. Furthermore, they agreed that every five years they would come back with an updated plan that would reflect their highest possible ambition at that time.

The Covid-19 pandemic forced the delay of the COP26 meeting, which was held last year in Glasgow, Scotland. However, only a limited number of countries and political organizations, including the European Union, Japan, the UK and the United States, submitted strengthened NDCs ahead of the Glasgow meeting. Only 23 countries had submitted updated NDCs by the deadline for this meeting, and that list includes only one major economy, Australia, whose updated NDC now brings it in line with its peers.

The United States (by executive order and administrative action) has set a goal to reach 100% carbon-free electricity by 2035 and net-zero emissions throughout the economy by 2050. The problem is that the reductions in emissions pledged so far are nowhere near sufficient to hold temperature change to 2 degrees Celsius according to the climate models. China was the largest CO2 emitter in 2021, at about 30% of the total, dwarfing the United States at 14%. China has only agreed to stop growing its CO2 emissions by 2030. The goals of the Paris Agreement cannot be met without reductions in China and the other nations still growing their emissions. Egypt's NDC submitted this year would allow its CO2 emissions to increase 50% by 2030. 



Sadly, CO2 emissions from fuel have continued to grow year after year, with the exception of brief respites during the global financial crisis and the Covid-19 lockdowns. Now European countries have been buying coal to use for electricity generation to replace the natural gas unavailable due to the war in Ukraine, China is finally showing signs of opening up its economy, and CO2 emissions are expected to resume their climb. Coal plants that were scheduled to shut down will continue to operate, and several recently shut down coal-fired turbines have been restarted. Coal-fired electricity generation emits about twice the CO2 of natural gas generation. 

Prior to the Paris Agreement the world was heading for 3.6 degrees Celsius of warming. The policies in place today would lead to warming of about 2.7 degrees Celsius by 2100. If countries fully implement their NDCs, warming would be around 2.4 degrees Celsius by 2100. The increase in extreme weather promised by the climate models appears to be in our future. 

Sunday, November 6, 2022

NASA Detects Methane Plumes

The below is from a NASA press release and a US EPA regulatory notification:

The US Environmental Protection Agency (EPA) is currently in the final phases of developing its new rule to reduce methane and other harmful air pollution from both new and existing sources in the oil and natural gas industry.

According to the EPA, the oil and gas industry includes a wide range of operations and equipment, from wells to natural gas gathering lines and processing facilities, to storage tanks, and transmission and distribution pipelines. Together these are a significant source of emissions of methane, a potent greenhouse gas with a global warming potential more than 25 times that of carbon dioxide.

Methane is expected to be an area of focus at the next UN Climate Conference that begins next week in Egypt. Toward that goal, the data gathered by NASA may be very informative.

NASA’s Earth Surface Mineral Dust Source Investigation (EMIT) mission is mapping the prevalence of key minerals in the planet’s dust-producing deserts, but EMIT has demonstrated another crucial capability: detecting the presence of methane, a potent greenhouse gas.

“Reining in methane emissions is key to limiting global warming. This exciting new development will not only help researchers better pinpoint where methane leaks are coming from, but also provide insight on how they can be addressed – quickly,” said NASA Administrator Bill Nelson. “The International Space Station and NASA’s more than two dozen satellites and instruments in space have long been invaluable in determining changes to the Earth’s climate. EMIT is proving to be a critical tool in our toolbox to measure this potent greenhouse gas – and stop it at the source.”

Methane absorbs infrared light in a unique pattern – called a spectral fingerprint – that EMIT’s imaging spectrometer can discern with high accuracy and precision. The instrument can also measure carbon dioxide.
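EMIT's actual retrievals rely on carefully calibrated radiative transfer models, but the basic idea of looking for a known spectral fingerprint in spectrometer data can be sketched with a classical matched filter. Everything below, from the array sizes to the synthetic spectra, is invented for illustration:

import numpy as np

rng = np.random.default_rng(0)
n_bands, n_pixels = 50, 1000

# Synthetic stand-ins: a unit methane "fingerprint" and background radiances
target = rng.normal(size=n_bands)
radiance = rng.normal(size=(n_pixels, n_bands))

mu = radiance.mean(axis=0)                 # mean background spectrum
cov = np.cov(radiance, rowvar=False)       # band-to-band covariance
cov_inv = np.linalg.inv(cov + 1e-6 * np.eye(n_bands))  # regularized inverse

# Matched-filter score per pixel: high scores flag pixels whose deviation
# from the background resembles the methane fingerprint
scores = (radiance - mu) @ cov_inv @ target
scores /= np.sqrt(target @ cov_inv @ target)
print("strongest candidate pixel:", int(scores.argmax()))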

The new observations stem from the broad coverage of the planet afforded by the space station’s orbit, as well as from EMIT’s ability to scan swaths of Earth’s surface dozens of miles wide while resolving areas as small as a soccer field.

“These results are exceptional, and they demonstrate the value of pairing global-scale perspective with the resolution required to identify methane point sources, down to the facility scale,” said David Thompson, EMIT’s instrument scientist and a senior research scientist at NASA’s Jet Propulsion Laboratory in Southern California, which manages the mission. “It’s a unique capability that will raise the bar on efforts to attribute methane sources and mitigate emissions from human activities.”

In the data EMIT has collected since being installed on the International Space Station last July, the science team has identified more than 50 “super-emitters” in Central Asia, the Middle East, and the Southwestern United States. Super-emitters are facilities, equipment, and other infrastructure, typically in the fossil-fuel, waste, or agriculture sectors, that emit methane at high rates.

The mission’s study area coincides with known methane hotspots around the world, enabling researchers to look for the gas in those regions to test the capability of the imaging spectrometer.

For example, the instrument detected a plume about 2 miles (3.3 kilometers) long southeast of Carlsbad, New Mexico, in the Permian Basin. One of the largest oilfields in the world, the Permian spans parts of southeastern New Mexico and western Texas.

In Turkmenistan, EMIT identified 12 plumes from oil and gas infrastructure east of the Caspian Sea port city of Hazar. Blowing to the west, some plumes stretch more than 20 miles (32 kilometers).

The EMIT team also identified a methane plume south of Tehran, Iran, at least 3 miles (4.8 kilometers) long, from a major waste-processing complex.

Scientists estimate flow rates of about 40,300 pounds (18,300 kilograms) per hour at the Permian site, 111,000 pounds (50,400 kilograms) per hour in total for the Turkmenistan sources, and 18,700 pounds (8,500 kilograms) per hour at the Iran site.

The Turkmenistan sources together have a similar flow rate to the 2015 Aliso Canyon, California gas leak, which exceeded 110,000 pounds (50,000 kilograms) per hour at times. The Los Angeles-area disaster was among the largest methane releases in U.S. history.
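A quick sanity check of the unit conversions in those flow rates (2.20462 pounds per kilogram) reproduces the press release figures:

KG_TO_LB = 2.20462
rates_kg_hr = {"Permian": 18_300, "Turkmenistan": 50_400, "Iran": 8_500}
for site, kg_hr in rates_kg_hr.items():
    print(f"{site}: {kg_hr * KG_TO_LB:,.0f} lb/hr")

# Prints roughly 40,300, 111,100 and 18,700 lb/hr, matching the figures above.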

With wide, repeated coverage from its vantage point on the space station, EMIT will potentially find hundreds of super-emitters – some of them previously spotted through air-, space-, or ground-based measurement, and others that were unknown.


“As it continues to survey the planet, EMIT will observe places in which no one thought to look for greenhouse-gas emitters before, and it will find plumes that no one expects,” said Robert Green, EMIT’s principal investigator at JPL.

EMIT is the first of a new class of spaceborne imaging spectrometers to study Earth. Carbon Plume Mapper (CPM), an instrument in development at JPL, is designed to detect methane and carbon dioxide. JPL is working with a nonprofit, Carbon Mapper, along with other partners, to launch two satellites equipped with CPM in late 2023.

Wednesday, November 2, 2022

Prince William Digital Gateway CPA Approved


This morning just before 9 am, after a marathon all-night meeting lasting 14 hours, the Prince William County Board of County Supervisors voted to approve the Comprehensive Plan Amendment for the Prince William Digital Gateway. The vote was straight across party lines, with the Democrats voting for industrial development in the rural area and the Republicans voting against. (Supervisor Candland recused himself and did not attend.)

This development in the northern portion of the Rural Crescent threatens the health of the Occoquan watershed and the very sustainability and affordability of the drinking water supply for Northern Virginia, including 350,000 residents of Prince William County. When an undeveloped or generally open rural area is developed, stormwater runoff increases in quantity and velocity, washing away stream banks, flooding roads and buildings, and carrying fertilizers, oil and grease, and road salt to the Occoquan Reservoir.

The total amount of planned data center space exceeds existing data center square footage in Loudoun County (the data center capital of the nation and the world). It took Loudoun County 14 years to build out the existing data centers, and Loudoun County still has approved data centers that have not yet been built. The majority of the 2,400 acres in the existing Data Center Overlay district are owned by data center development companies or directly by data center operators. This land has not yet been built on.

The one growing sector of electricity demand in Virginia is data centers, and, wow, is that growing. In 2018, power demand from data centers was just over 1 gigawatt; this past fall it reached 2.6 gigawatts, and it is projected to double in the next few years with projects already under way. 
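A quick calculation using those two figures (1 gigawatt in 2018, 2.6 gigawatts in 2022) shows just how fast that is:

import math

demand_2018, demand_2022 = 1.0, 2.6  # gigawatts
years = 2022 - 2018

cagr = (demand_2022 / demand_2018) ** (1 / years) - 1
print(f"Compound annual growth: {cagr:.0%}")  # about 27% per year

doubling = math.log(2) / math.log(1 + cagr)
print(f"Doubling time at that pace: {doubling:.1f} years")  # about 2.9 years

Growth of roughly 27% a year doubles demand about every three years, consistent with the projection that demand will double again in just a few years.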

It is clear that there is no limit to the desirability of data centers to county supervisors and landowners. The counties have been blinded by the windfall profits to the landowners and the prospect of increased tax revenue. This approval will more than double the number of data centers in all of Northern Virginia, and this massive change in use will bring great wealth to the landowners: land that was worth $25,000-$50,000 an acre is now magically worth almost $1,000,000 an acre for data center use. But these windfall profits come at the cost of degradation of our land and water resources and increased power and water costs for all Virginians. 

I offer my congratulations to Maryanne Gahaban on orchestrating the sale of the 194 parcels and 2,139 acres of rural and rural residential land for $2.1 billion. Well played. Take your money and go.