Wednesday, November 30, 2022

New Rule Proposes that Federal Contractors Disclose GHG Emissions

On November 10th the White House announced that the Administration is proposing a new rule, the Federal Supplier Climate Risks and Resilience Rule. This rule would require major Federal contractors to publicly disclose their greenhouse gas emissions and climate-related financial risks and to set science-based emissions reduction targets.

Under the proposed rule, suppliers and contractors to the Federal government receiving more than $50 million in annual contracts would be required to publicly disclose Scope 1, Scope 2, and relevant categories of Scope 3 emissions, disclose climate-related financial risks, and set science-based emissions reduction targets. Smaller Federal suppliers and contractors with more than $7.5 million but less than $50 million in annual contracts would be required to report only Scope 1 and Scope 2 emissions. All Federal contractors with less than $7.5 million in annual contracts would be exempt from the rule.
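
For readers who want the tiers laid out at a glance, here is a minimal sketch in Python of the dollar cutoffs described above. The function name and the informal tier labels are my own illustration, not terms from the proposed rule.

```python
def disclosure_requirements(annual_contract_dollars: float) -> list:
    """Illustrative mapping of annual Federal contract dollars to the
    disclosure tiers described in the proposed Federal Supplier Climate
    Risks and Resilience Rule. Labels are informal, not regulatory terms."""
    if annual_contract_dollars > 50_000_000:      # "major" contractor
        return ["Scope 1 emissions", "Scope 2 emissions",
                "relevant Scope 3 categories",
                "climate-related financial risks",
                "science-based emissions reduction targets"]
    elif annual_contract_dollars > 7_500_000:     # smaller supplier
        return ["Scope 1 emissions", "Scope 2 emissions"]
    else:                                         # exempt from the rule
        return []

print(disclosure_requirements(60_000_000))
```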

According to National Grid, Scope 1 and Scope 2 emissions are the direct and indirect emissions of an organization; they come from sources that are owned or controlled by the company, whereas Scope 3 emissions are a consequence of the company's activities but occur at sources it does not own or control. Examples of Scope 1 emissions would be burning fuel in a company's fleet of vehicles (if they are not electrically powered) or heating a building with a gas furnace.

Scope 2 emissions are the emissions a company causes indirectly when the energy it purchases is produced, for example, the emissions from generating the electricity used to power its buildings, electric cars and trucks, and other electric-powered equipment.

Scope 3 emissions encompass emissions that are not produced by the company itself and are not the result of activities from assets it owns or controls, but for which it is indirectly responsible up and down its value chain. An example is the purchase, use, and disposal of products from its suppliers.

Each major contractor (those with more than $50 million in contracts) would have to publish an annual climate disclosure report that would include a qualitative disclosure of climate-related risks. In addition, each major contractor would also be required to publish science-based targets to reduce GHG emissions in line with what the latest science deems necessary to meet the goals of the Paris Agreement, as validated by a third party. It is unclear how a Federal supplier of $50 million in goods or services is going to mitigate the GHG emissions in the grid, other than by buying carbon credits, or what the costs of compliance will be.

As UN Secretary-General Guterres recently pointed out, the criteria for net-zero commitments can have loopholes wide enough to “drive a diesel truck through.” At the recently ended COP27 meeting Mr. Guterres said that net-zero pledges should be accompanied by a plan for how the transition is being made: “Management must be accountable for delivering on these pledges.”

The Federal Acquisition Regulatory Council, which consists of the Department of Defense, the General Services Administration, the National Aeronautics and Space Administration, and is chaired by the Office of Federal Procurement Policy in the Office of Management and Budget, has issued this proposed rule under the Federal Acquisition Regulation (FAR). The FAR is the primary regulation for use by all executive agencies in their acquisition of supplies and services with appropriated funds.

Let's look at the Scope 1, 2 and 3 emissions disclosure for Amazon for the period 2019-2021. If you recall, Amazon was a co-creator of The Climate Pledge, a commitment to reach net-zero carbon by 2040, 10 years ahead of the Paris Agreement. Since they created the Pledge in 2019, more than 300 companies have joined Amazon in making this commitment. Amazon appears to have no “science-based” plan for meeting their commitment.

With all of the growth that Amazon has chased, and despite purchasing carbon offsets and solar farms, Amazon's carbon emissions in 2021 were 40% greater than they were in 2019. However, their carbon intensity decreased by 18% over the same period. Carbon intensity quantifies total carbon emissions, in grams of carbon dioxide equivalent (CO₂e), per dollar of gross merchandise sales. Amazon has not released a plan for how they will meet their climate pledge. Rather, Amazon states: “As companies invest in new products and services, and their businesses grow substantially, the focus should not be solely on a company's carbon footprint in terms of absolute carbon emissions, but also on whether it's lowering its carbon intensity.”
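
The arithmetic behind those two figures is worth spelling out, because it shows how both statements can be true at once. A back-of-the-envelope sketch, using only the 40% and 18% changes quoted above; no Amazon dollar figures are assumed:

```python
# Carbon intensity = emissions (g CO2e) / gross merchandise sales ($).
# If 2021 emissions are 40% above 2019 while intensity fell 18%,
# the implied growth in sales is the ratio of the two factors.
emissions_growth = 1.40    # 2021 emissions / 2019 emissions
intensity_ratio = 0.82     # 2021 intensity / 2019 intensity

implied_sales_growth = emissions_growth / intensity_ratio
print(f"Implied sales growth: {implied_sales_growth:.2f}x")   # ~1.71x
```

In other words, sales growing roughly 70% is what lets absolute emissions rise sharply while the intensity metric still improves.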




Sunday, November 27, 2022

Solar Adoption and Solar Incentives

When a new consumer technology makes its debut, its adoption rate typically follows a predictable path. The first buyers come from a narrow slice of high-income users or tech enthusiasts who are willing to pay high prices. Over time, as prices fall and economies of scale kick in, sales climb sharply and the technologies become mass-market products. Eventually, the market becomes saturated, and the number of users reaches a plateau.
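
That classic S-shaped adoption path is usually described with a logistic curve. A minimal sketch of the shape; the saturation level, midpoint, and growth rate below are arbitrary illustration values, not estimates from any study:

```python
import math

def logistic_adoption(t, saturation=0.8, midpoint=10, rate=0.5):
    """Fraction of potential adopters who have adopted by year t,
    under a simple logistic (S-curve) model of technology adoption."""
    return saturation / (1 + math.exp(-rate * (t - midpoint)))

for year in (0, 5, 10, 15, 20):
    print(year, round(logistic_adoption(year), 3))  # slow start, steep middle, plateau
```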

This pattern of adoption is what was expected for rooftop solar photovoltaic arrays, but they have not performed that way. In a recent study published in Joule and led by Zhecheng Wang, a doctoral student in Stanford's Department of Civil and Environmental Engineering, researchers examined the adoption of solar photovoltaic panels in the United States.

Previously, Stanford researchers had analyzed the number of solar installations at a single point in time. That work quantified that solar arrays were much less common in low-income communities, but it did not offer much insight into the pattern of adoption of this technology, which is important to understand from a public policy point of view.

To investigate why, the researchers at Stanford developed a computer model that interprets the low image resolution of older satellite imagery, enabling them to identify the installation year of PV arrays from historical aerial and satellite images. The model, which they named DeepSolar++, analyzed satellite images to identify where solar panels are and when they were installed in more than 400 counties across the United States. The researchers compiled images from 2006 through 2017 (a narrow time span for the adoption of an expensive technology with limited gratification and/or status appeal) and then combined that data with information about each community's demographics as well as local financial incentives for solar power.
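
Conceptually, once a model can say whether a panel is visible in each year's imagery, pinning down the installation year reduces to finding when detections begin. A much-simplified sketch of that last step; this is my illustration of the idea, not DeepSolar++ itself, which also has to cope with low-resolution and missing imagery:

```python
def estimate_install_year(detections):
    """detections: dict mapping year -> True/False (panel visible that year).
    Return the first year of a sustained run of detections, or None if the
    roof never shows a panel."""
    years = sorted(detections)
    for i, year in enumerate(years):
        # Require detection in this year and the next observed year
        # to guard against a single false positive.
        if detections[year] and (i + 1 >= len(years) or detections[years[i + 1]]):
            return year
    return None

print(estimate_install_year(
    {2006: False, 2010: False, 2013: True, 2015: True, 2017: True}))  # -> 2013
```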

Their analysis showed that low-income communities are not only delayed in their adoption onset but also appear to saturate more quickly at lower adoption levels. Though the time frame of the study may simply be too narrow to capture the full life cycle of solar adoption, the researchers assumed it was not and examined the correlation of adoption with financial incentives.

Federal, state and local governments have long offered financial incentives, often in the form of rebates on income or property taxes. Performance-based incentives are much less common. The Stanford researchers used a federal database of state incentives for renewable energy to identify which kinds of incentives were available in each community. This overlooked some local incentives; nonetheless, they found that only upper-income communities seemed to respond to tax incentives.

Their analysis of financial incentives offered on a state-by-state basis found that performance-based incentives (which reward customers based on how much solar power they produce or how much less electricity they buy from the grid) are positively associated with higher saturated adoption levels in lower-income communities. Causality was not shown, only a correlation, and the researchers were unsure why performance-based incentives seem effective among lower-income communities.


The researchers pointed out that lower-income families have much lower taxes and thus benefit less from both property and income tax breaks. People who rent rather than own their homes pay no property taxes at all. The lead author, Ram Rajagopal, speculates that the less common performance-based type of incentive may motivate the owners of apartment buildings. Without financial incentives, installing solar photovoltaic panels to generate electricity is still more expensive than buying electricity from the grid. If you do not fully benefit from financial incentives, you do not install solar. This is simply rational behavior.

As Investopedia pointed out in their excellent analysis of the costs and benefits of solar power, it is capital intensive and the main cost of owning a system comes upfront when buying the equipment. The money has to be paid upfront for design, permits, solar panels, inverters, wiring, installation and so on. In addition to installation costs, there are operating and maintenance costs for a photovoltaic solar array. Aside from cleaning the panels regularly, inverters generally need replacement after several years of use, rack systems fail (especially in snow, which can lift up the racks), and roofs leak, especially after racks are mounted on them. In addition, there is great variability in solar production potential based on location, the orientation of the roof, and shading.


The cold hard numbers of my solar system

The financial benefits of solar are limited to the long-term cost savings generated after the system pays for itself and before the roof needs to be replaced. This can be a short window of time and requires that the homeowner live in the house for about 15 years to benefit fully. There are limited emotional benefits of owning solar, which are mostly private and have limited appeal. Unlike driving a Tesla or flashing your iPhone 14, virtually no one sees my solar panels. I think the Stanford researchers, who are clearly interested in public policy, should include a behavioral economist on their team.
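
For anyone who wants to run their own cold hard numbers, the basic payback arithmetic is simple. A sketch with hypothetical inputs; every number below is an assumption for illustration and should be replaced with your own quotes, tax situation, and utility rates:

```python
# Hypothetical inputs -- substitute your own quotes and utility rates.
installed_cost = 24_000          # $ upfront: panels, inverter, permits, labor
federal_tax_credit = 0.30        # assumed fraction of cost returned as a tax credit
annual_production_kwh = 9_000    # depends heavily on roof orientation and shading
electricity_price = 0.13         # $ per kWh of grid electricity avoided
annual_maintenance = 150         # cleaning plus amortized inverter replacement

net_cost = installed_cost * (1 - federal_tax_credit)
annual_savings = annual_production_kwh * electricity_price - annual_maintenance
payback_years = net_cost / annual_savings
print(f"Simple payback: {payback_years:.1f} years")   # roughly 16.5 years with these inputs
```

With these made-up but plausible inputs the payback lands in the same neighborhood as the roughly 15-year window described above, which is exactly why the incentives matter so much.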

The Stanford study was funded by the U.S. Department of Energy, the National Science Foundation and a Stanford Precourt Pioneering Project award.

Tuesday, November 22, 2022

COP 27 Ends

The 27th Conference of the Parties (COP27) closed its meeting in Sharm el-Sheikh, Egypt with little if any substantial progress. After missing their Friday night deadline, negotiators were able to agree on a commitment to set up a financial support structure for the most vulnerable nations by the next COP in 2023.

Yet, while agreement on these issues was welcomed as a step in the right direction, there appeared to be little forward movement on other key issues, particularly the phasing out of fossil fuels and tightened language on the need to limit global warming to 1.5 degrees Celsius. In fact, new language added at this meeting that includes “low emissions” energy alongside renewables as the energy sources of the future is a significant loophole. The undefined term could be used to justify new fossil fuel development against the clear guidance of the UN Intergovernmental Panel on Climate Change (IPCC) and the International Energy Agency (IEA).

The truth is the window for limiting global temperature rise to 1.5 degrees Celsius is closing rapidly. CO2 emissions from fuel have continued to grow year after year, with the exceptions of a brief respite during the global financial crisis and the Covid-19 lockdowns. Now, CO2 emissions are expected to resume their climb. Coal plants that were scheduled to shut down have continued to operate, several recently shut down coal-fired turbines have been restarted in Europe, and China continues to build coal plants. Coal-fired electricity generation emits about twice the CO2 of natural gas.

While a growing number of governments and non-State actors are pledging to be carbon-free, the criteria for net-zero commitments can have loopholes wide enough to “drive a diesel truck through,” the UN Secretary-General, Mr. Guterres, decried as his expert group on the matter published its first report.

That report slams greenwashing – misleading the public to believe that a company or entity is doing more to protect the environment than it is – and provides a roadmap to bring integrity to net-zero commitments by industry, financial institutions, cities and regions, and to support a global, equitable transition to a sustainable future.

“Using bogus ‘net-zero’ pledges to cover up massive fossil fuel expansion is reprehensible. It is rank deception. This toxic cover-up could push our world over the climate cliff. The sham must end.” Mr. Guterres said that net-zero pledges should be accompanied by a plan for how the transition is being made: “Management must be accountable for delivering on these pledges.”

As a member of the Prince William County Sustainability Commission, I look forward to the development of and implementation of the Community Energy and Sustainability Master Plan (CESMP), which will serve as a roadmap for the county to reach its climate mitigation and resiliency goals. 

Sunday, November 20, 2022

Recycled Water

All the water that was or ever will be on Earth is here right now and is over 4 billion years old. More than 97% of the Earth's water is in the oceans. The remaining 2.8% of water is the fresh water found within the land masses.

Of that fresh water, 77% is estimated to be in icecaps and glaciers (melting away as the planet warms), and the remaining fresh water is stored primarily in the subsurface as groundwater. The tiny fraction of a percent of water that remains is the rivers and lakes that supply the lion's share of mankind's needs. Rivers and lakes are repeatedly renewed by rainfall. Rain drops fall to earth and evaporate, infiltrate into the soil, recharge groundwater, or flow along the ground to a stream and ultimately into rivers and the ocean, always moving.

 As average temperatures at the Earth’s surface rise, more evaporation occurs, which, in turn, should increase overall precipitation. Therefore, a warming climate is expected to increase precipitation in many areas, but not all.

According to NASA: “Current climate models indicate that rising temperatures will intensify the Earth's water cycle, increasing evaporation. Increased evaporation will result in more frequent and intense storms, but will also contribute to drying over some land areas. As a result, storm-affected areas are likely to experience increases in precipitation and increased risk of flooding, while areas located far away from storm tracks are likely to experience less precipitation and increased risk of drought.”

We are past the point where we can try to stop or reverse climate change and hope the climate will return to what it had been. Precipitation will not continue to fall in the patterns of the past. According to the U.S. EPA, on average, total annual precipitation has increased over land areas in the United States; however, a few areas, such as the Southwest, have seen a decrease in precipitation.

A mix of growing population, economic growth and changes in precipitation patterns has created a severe water shortage in California that could grow into a crisis in the near future. As drinking water sources become more scarce, California and other states look to directly recycle wastewater to drinking water. Potable reuse systems are up and running around the United States. The Orange County Water District has run the world’s largest water recycling plant since the 1970s. Water providers in Northern Virginia, Atlanta, Georgia, and Aurora, Colorado, also use potable reuse water as part of their drinking water supplies.

Potable reuse, the process of treating wastewater to drinkable standards, offers a reliable and sustainable solution to cities and regions facing shortages of clean water. The city of Los Angeles and water agencies across Southern California are looking into what's known as “direct potable reuse,” which means putting purified recycled water directly back into our drinking water systems, what was once called toilet-to-tap recycling. This differs from the indirect potable reuse we have here, where the water spends time in our Occoquan reservoir.

Los Angeles plans to recycle all of its wastewater by 2035, and the California State Water Resources Control Board has been tasked by legislators with developing a set of uniform regulations on direct potable reuse by December 31, 2023. Meanwhile, a direct potable reuse demonstration facility near Griffith Park will serve to demonstrate the concept.

To accomplish that, the Hyperion Water Reclamation Plant — which like all wastewater treatment plants currently treats wastewater only to the level necessary for release— must be converted into an advanced water purification facility that produces water clean enough to drink.

To complement these plans, a group at Stanford University has been studying the quality of reclaimed water. They expected that potable reuse waters would be cleaner, in some cases, than conventional drinking water because much more extensive treatment is conducted for them, said Stanford professor William Mitch, author of a new study published in Nature Sustainability, Toxicological assessment of potable reuse and conventional drinking waters.

The engineers found that, after treatment, potable reuse water is cleaner than conventional drinking water sourced from pristine-looking rivers. In most rivers, somewhere upstream are wastewater and stormwater releases, which receive much less treatment than occurs in potable reuse systems.

Regulators demand more extensive treatment at potable reuse treatment plants. They specify that treatment systems must remove harmful pathogens, such as viruses and amoebas, and utilities flush out other contaminants using reverse osmosis, ozonation, biofiltration, and other cleaning techniques.

Reverse osmosis treatment pushes water at high pressure through a membrane with pores so small that it is also the method used for desalination. Dr. Mitch and his colleagues found the process cleans wastewater as much as, if not more than, groundwater, which is the gold standard. Even when reverse osmosis was not applied, reuse waters were less toxic than the samples of conventional drinking waters sourced from rivers across the United States.

Conventional wastewater treatment plants just aren't equipped to deep clean. This leaves many organic contaminants, such as chemicals from shampoos and medicines and trace contaminants from our manufactured products, floating downriver and straight into a drinking water plant. Direct potable reuse will require building water treatment plants that remove all these contaminants. Los Angeles estimates it will cost $16 billion to implement these upgrades. However, California is not known for reliable cost estimates for big projects.

Read the full study: Lau, S.S., Bokenkamp, K., Tecza, A. et al. Toxicological assessment of potable reuse and conventional drinking waters. Nat Sustain (2022). https://doi.org/10.1038/s41893-022-00985-7.

Wednesday, November 16, 2022

Cover Crops, no simple solution

Planting cover crops is a key tenet of conservation agriculture that involves planting non-cash crops on agricultural fields to provide soil cover between primary crop growing seasons. Cover crops primarily benefit future crops. They do this by reducing soil erosion and nitrogen runoff, crowding out weeds, controlling pests and diseases, increasing biodiversity, and improving soil health by helping to build soil carbon.

Building soil carbon also serves to reduce CO2 in the atmosphere. So, cover cropping has been well funded under the United States Department of Agriculture's (USDA's) Environmental Quality Incentives Program, which turned all agencies toward climate stewardship and has provided more than $100 million in incentives for cover crop adoption each year since 2016. An additional incentive of reduced insurance premiums was added through the Pandemic Cover Crop Program.

Under these incentives, the total cropland area in the United States planted with cover crops in 2017 was nearly 50% higher than reported in 2012 and has continued rising in the past five years. It sounds impressive until you realize that, overall, only about 5% of cropland used cover crops in 2017.

Cover crops (grasses, legumes and forbs) recommended for seasonal cover and other conservation purposes include annual ryegrass, oilseed radish, winter cereal rye, and oats, which are used for scavenging unused fertilizer and releasing nutrients back into the soil for the next crop to use. Good cover crops to break up compacted soils are forage radish and forage turnip. Similar to commercial nitrogen (N) fertilizers, legume cover crops like crimson clover, hairy vetch and Austrian winter pea can provide some of the nitrogen needs of the primary crop.

Experimental field trials have often found slight yield losses for primary crops. However, these effects appear to vary considerably depending on many factors, including the agricultural region, the combination of cover and primary crop types, weather conditions, and management practices. Results from field trials varied widely based on the type of cover crop, the level of fertilization, and the date of cover crop termination.

A new study from Stanford scientists examines yield loss using data from actual farmer fields. They used satellite data to observe both the adoption of cover cropping and the yields of corn and soybeans throughout six states in the heart of the US Corn Belt. These observations, covering more than 90,000 fields, were then used in an algorithm developed by others to measure the incremental yield impact of adopting cover crops.

Using the satellite data, they could determine the presence or absence of cover crops each year at field-level resolution. They used the previously published Scalable Crop Yield Mapper (SCYM) algorithm to estimate yield. SCYM uses region-specific crop model simulations and weather data to determine yields from satellite pixel data. Because they were using satellite data, their analysis could only represent the yield impacts of cover cropping as practiced in aggregate across the region.
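
In spirit, the comparison boils down to contrasting estimated yields on fields that adopted cover crops with otherwise similar fields that did not. A much-simplified sketch of that comparison; the study's actual analysis controls for soil, weather, and other confounders, and the field records and yields below are invented for illustration:

```python
# Each record: (field_id, cover_cropped_3plus_years, estimated_corn_yield_bu_per_acre)
# Yields would come from a satellite-based estimator such as SCYM; these are made up.
fields = [
    ("A", True, 175), ("B", True, 180), ("C", False, 190),
    ("D", False, 186), ("E", True, 172), ("F", False, 188),
]

def mean_yield(records, adopted):
    ys = [y for _, a, y in records if a == adopted]
    return sum(ys) / len(ys)

adopters = mean_yield(fields, True)
non_adopters = mean_yield(fields, False)
print(f"Yield difference: {(adopters / non_adopters - 1) * 100:.1f}%")  # negative = yield loss
```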

The algorithm results indicated that fields where cover crops were adopted for 3 or more years experienced an average corn yield loss of 5.5%, compared with fields that did not practice cover cropping. The scientists also found that, on average, soybean yields were reduced by 3.5% following cover crop adoption. Nearly all locations appeared to experience negative effects. In general, impacts appeared most negative in Iowa and Northern Illinois compared with the rest of the study region; these areas were generally associated with better soil ratings and higher mid-season temperatures.

The scientists found greater yield losses for corn than for soybean, which they felt was likely due to soybean's lower need for fertilizer nitrogen. They also found that corn yield impacts were significantly more negative on fields with a high soil productivity index (NCCPI). The scientists reasoned that because those fields have higher yield potential, they accordingly have higher nitrogen needs to meet that potential.

Based on anecdotal observations in our own Prince William Soil and Water Conservation District: “Small yield losses may be seen in certain situations, in certain years, and the longer growers work with integrating covers in their systems the better they get at managing them, thus reducing these losses. The other thing they didn't look at was the economics. Going no-till and using covers reduce fuel and fertilizer used. Even though yield may be slightly reduced, profit may actually be better.” (Jay Yankey, former Manager of the PWSWC and current farmer.)

While there is on-the-ground research supporting the numerous benefits of introducing cover crops into a system, there are also challenges that growers may face in implementation or management. Cover cropping is different in different agricultural systems. Particularly in arid or drought-prone environments, the water needs of cover crops may reduce the amount available to the main crop or require the use of supplemental irrigation.

In addition to potential increases in irrigation, there are other economic costs that must be considered. Expenditures for seed and soil preparations as well as labor requirements will change with the introduction of a cover crop. Because cover crops are left in the field, there is no direct profit to the farmer from harvested crop products. If improperly selected or managed, some cover crops can persist as weeds when the field is transitioned and prepared for subsequent plantings.


Sunday, November 13, 2022

Natural Gas Appliances and Global Warming

For several years the U.S. Department of Energy has been promoting the use of induction for home cooking. Conventional residential cooking tops typically use gas or resistance electric heating elements (the ubiquitous coil) to heat food. The government estimates that gas stoves are approximately 32% efficient in their energy use, while electric stoves are 75-80% efficient. Residential induction burners consist of an electromagnetic coil that creates a magnetic field when turned on. Compatible cookware is heated when it is within the magnetic field; according to the DOE, induction cooking is about 85% efficient. Less heat is lost to the surrounding air, providing an additional energy efficiency benefit by reducing the workload for air conditioning equipment. A cooler cooking top surface also makes induction cook tops safer to work with than other types of cooking tops. Finally, because the cookware itself is the source of heat, it reaches desired temperatures more quickly and provides faster cook times.
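
Those efficiency figures translate directly into how much input energy each technology needs to put the same amount of heat into a pan. A quick illustrative calculation using the percentages quoted above; the 2 kWh of delivered heat is an arbitrary example amount:

```python
# Energy that actually reaches the cookware, in kWh (arbitrary example amount).
heat_delivered_kwh = 2.0

# Efficiencies quoted above: gas ~32%, resistance electric 75-80%, induction ~85%.
efficiencies = {"gas": 0.32, "electric resistance": 0.775, "induction": 0.85}

for tech, eff in efficiencies.items():
    input_energy = heat_delivered_kwh / eff
    print(f"{tech}: {input_energy:.1f} kWh of input energy")
# Gas needs roughly 2.6x the input energy that induction does (0.85 / 0.32).
```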

I had always dreamed of a kitchen with a commercial or commercial-style stove. When I had saved up the money to upgrade my kitchen, I realized that the kitchen centerpiece stove was not my best choice. First of all, it is a warming world and those stoves throw off lots of heat; second, I live in a rural area where natural gas (methane) is not available, so instead we have a propane tank; and third, commercial stoves are simply not good at a low simmer, my preferred cooking style. I make lots of sauces, gravy, stews and soups. Gas burners (especially propane with its three carbons) burn too hot. So, in 2018 when I updated my kitchen I installed an induction cook top. I have been amazingly happy with that choice. The cooking is all I had hoped. What I had not anticipated is how easy and fast it is to clean, and the bad kitty cannot accidentally turn it on.

Now scientists are taking a closer look at cooking with gas. Natural gas is a popular fuel choice for home cooking and has always been considered better than conventional electric; it has the reputation that “real cooks” use natural gas. Nationally, over 40 million homes (about a third) cook with gas. Natural gas appliances release methane and other pollutants through leaks and incomplete combustion. These appliances warm the planet in two ways: generating carbon dioxide by burning natural gas as a fuel and leaking unburned methane into the air. A recent Stanford University study found that natural gas-burning stoves emit up to 1.3% of the gas they use as unburned methane.

According to the U.S. EPA, methane is the second most prevalent greenhouse gas and accounted for about 10% of all U.S. greenhouse gas emissions from human activities. Methane is emitted by natural sources such as wetlands and the breakdown of organic material, as well as from leakage from natural gas systems, growing rice, waste disposal and the raising of livestock. Methane is a powerful greenhouse gas and is 25 times more effective than carbon dioxide at trapping heat over a 100-year period. While it does occur naturally, major human-generated sources include landfills, refineries, oil and gas fields, natural gas infrastructure, dairies and wastewater treatment plants.
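
To get a feel for what a 1.3% leak rate means in greenhouse gas terms, here is a rough sketch that combines the Stanford leakage figure with the EPA's 100-year warming factor of 25 quoted above. The annual amount of gas a stove burns is an assumed illustrative value; only the combustion chemistry (one CO2 per CH4 burned) and the two quoted figures are taken as given:

```python
annual_stove_gas_kg = 60.0       # assumed annual methane used by one stove, kg (illustrative)
leak_fraction = 0.013            # up to 1.3% escapes unburned (Stanford study figure)
gwp_100yr = 25                   # EPA 100-year warming factor for methane

burned_kg = annual_stove_gas_kg * (1 - leak_fraction)
co2_from_combustion = burned_kg * 44 / 16            # CH4 + 2 O2 -> CO2 + 2 H2O (mass ratio 44/16)
co2e_from_leaks = annual_stove_gas_kg * leak_fraction * gwp_100yr

print(f"CO2 from burning:  {co2_from_combustion:.0f} kg")
print(f"CO2e from leakage: {co2e_from_leaks:.0f} kg")
# With these assumptions, leakage adds roughly 12% on top of the combustion CO2.
```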

This work came out of Dr. Jackson’s lab at Stanford University where they are working to measure and reduce greenhouse gas emissions through the Global Carbon Project (globalcarbonproject.org), which Jackson chairs. Some of their work is directly aimed at measuring and reducing methane emissions from oil and gas wells, city streets, and homes and buildings. According to Dr. Jackson and his colleagues, curbing methane emissions will require reducing fossil fuel use and controlling fugitive emissions such as leaks from pipelines and wells, as well as changes to the way we feed cattle, grow rice and eat. “We’ll need to eat less meat and reduce emissions associated with cattle and rice farming,” Dr. Jackson said, “and replace oil and natural gas in our cars and homes.”

The scientists measured methane and nitrogen oxides released in 53 homes in California, not the biggest of samples. Their sample group included 18 brands of gas cooktops and stoves ranging in age from 3 to 30 years old. Measurements were taken during combustion, ignition, and extinguishment, and also while the appliance was off.

The scientists found no relationship between the age or cost of a stove and its emissions. What they did find was that more than three-quarters of methane emissions occurred while stoves were off, suggesting that gas fittings and connections to the stove and in-home gas lines are responsible for most emissions, regardless of how much the stove is used. They should probably have examined the age of the interior piping and fittings in the homes, but that was not part of the study. California does not require a building permit when you replace gas appliances the way we do here, so the fittings in California are not tested regularly over time.

The scientists found the highest emitters were cooktops that used a pilot light instead of a built-in electronic sparker. Methane emissions from the puffs of gas emitted while igniting and extinguishing a burner were on average equivalent to the amount of unburned methane emitted during about 10 minutes of cooking with the burner.

Larger stoves (those trophy kitchen appliances) tended to emit higher rates of nitrogen oxides. The scientists estimated that people who don't use their range hoods or who have poor ventilation can surpass the EPA's guidelines for 1-hour exposure to nitrogen dioxide outdoors (there are no indoor standards) within a few minutes of stove usage, particularly in smaller kitchens.

Dr. Jackson encourages switching to electric stoves to cut greenhouse gas emissions and indoor air pollution. I switched to induction to get fabulous cooking, easy cleanup and energy efficiency. I maintain propane in my home to power my backup generator, a propane furnace, a gas fireplace (I'm thinking about it) and a hot water heater. Without electricity I have no water: my well pump does not work, my air-source heat pumps do not work, and all my kitchen appliances and freezer go down. We have lost power for several days after a storm in the winter and once in the summer. Because I have the generator and backup systems, my pipes did not burst, my septic pump continued to operate and life went on.

Wednesday, November 9, 2022

COP27 Opens in Egypt

This week the 27th Conference of the Parties (COP27) opened its meeting in Sharm el-Sheikh, Egypt. The conference will run until November 18th, 2022. The prospects for significant progress appear dim.

If you recall, in December 2015 at the 21st Conference of the Parties in Paris, delegates from 196 countries reached an agreement that we all hoped put the nations on a course to reduce carbon dioxide emissions from the combustion of fossil fuel.

Under the Paris Agreement, every country agreed to work together to limit global warming to “well below 2 degrees” and aim for 1.5 degrees, to adapt to the impacts of a changing climate and to make money available to deliver on these aims to countries not able to afford the costs of adapting to a changing climate. The parties to the agreement committed to create national plans setting out how much they would reduce their emissions called Nationally Determined Contributions (NDC). Furthermore, they agreed that every five years they would come back with an updated plan that would reflect their highest possible ambition at that time.

The Covid-19 pandemic forced the delay of the COP26 meeting, which was held last year in Glasgow, Scotland. However, only a limited number of countries and political organizations, including the European Union, Japan, the UK and the United States, submitted strengthened NDCs ahead of the Glasgow meeting. Only 23 countries had submitted updated NDCs by the deadline for this meeting, and that list includes only one major economy, Australia, whose NDC now brings it in line with its peers.

The United States (by executive order and administrative action) has set a goal to reach 100% carbon-free electricity by 2035 and net zero emissions throughout the economy by 2050. The problem is that the reductions in emissions pledged so far are nowhere near sufficient to hold temperature change to 2 degrees Celsius, according to the climate models. China was the largest CO2 emitter in 2021 at about 30% of the total, dwarfing the United States at 14%. China has only agreed to stop growing its CO2 emissions by 2030. The goals of the Paris Agreement cannot be met without reductions in China and the other nations still growing their emissions. Egypt's NDC submitted this year would increase its CO2 emissions 50% by 2030.



Sadly, CO2 emissions from fuel have continued to grow year after year, with the exceptions of a brief respite during the global financial crisis and the Covid-19 lockdowns. Now, European countries have been buying coal to use for electricity generation to replace the natural gas unavailable due to the war in Ukraine, China is finally showing signs of opening up its economy, and CO2 emissions are expected to resume their climb. Coal plants that were scheduled to shut down will continue to operate and several recently shut down coal-fired turbines have been restarted. Coal-fired electricity generation emits about twice the CO2 of natural gas.

Prior to the Paris Agreement the world was heading for 3.6 degrees Celsius of warming. The policies in place today would lead to a warming of about 2.7 degrees Celsius by 2100. If countries fully implement their NDCs, it would be around 2.4 degrees Celsius by 2100. The increase in extreme weather promised by the climate models appears to be in our future.

Sunday, November 6, 2022

NASA Detects Methane Plumes

The below is from a NASA press release and a US EPA regulatory notification:

The US Environmental Protection Agency (EPA) is currently in the final phases of developing its new rule to reduce methane and other harmful air pollution from both new and existing sources in the oil and natural gas industry.

According to the EPA, the oil and gas industry includes a wide range of operations and equipment, from wells to natural gas gathering lines and processing facilities, to storage tanks, and transmission and distribution pipelines. These are a significant source of emissions of methane, a potent greenhouse gas with a global warming potential more than 25 times that of carbon dioxide.

Methane is expected to be an area of focus at the next UN Climate Conference that begins next week in Egypt. Toward that goal, the data that has been gathered by NASA may be very informative.

NASA’s Earth Surface Mineral Dust Source Investigation (EMIT) mission is mapping the prevalence of key minerals in the planet’s dust-producing deserts, but EMIT has demonstrated another crucial capability: detecting the presence of methane, a potent greenhouse gas.

“Reining in methane emissions is key to limiting global warming. This exciting new development will not only help researchers better pinpoint where methane leaks are coming from, but also provide insight on how they can be addressed – quickly,” said NASA Administrator Bill Nelson. “The International Space Station and NASA’s more than two dozen satellites and instruments in space have long been invaluable in determining changes to the Earth’s climate. EMIT is proving to be a critical tool in our toolbox to measure this potent greenhouse gas – and stop it at the source.”

Methane absorbs infrared light in a unique pattern – called a spectral fingerprint – that EMIT’s imaging spectrometer can discern with high accuracy and precision. The instrument can also measure carbon dioxide.
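
Detection from an imaging spectrometer is commonly done with a matched filter: each pixel's spectrum is compared against the background statistics of the scene and scored by how strongly it contains the methane absorption fingerprint. A generic sketch of that idea; this is the standard textbook approach, not necessarily the exact algorithm the EMIT team uses, and the arrays below are random placeholders:

```python
import numpy as np

def matched_filter_scores(radiance, target_signature):
    """radiance: (n_pixels, n_bands) spectra from an imaging spectrometer.
    target_signature: (n_bands,) expected change in radiance per unit methane.
    Returns a per-pixel enhancement score; large values suggest a plume."""
    mu = radiance.mean(axis=0)                 # background mean spectrum
    cov = np.cov(radiance, rowvar=False)       # background covariance across bands
    cov_inv = np.linalg.pinv(cov)              # pseudo-inverse for numerical stability
    t = target_signature
    denom = t @ cov_inv @ t
    return ((radiance - mu) @ cov_inv @ t) / denom

# Toy usage with random placeholder data (500 pixels, 50 spectral bands).
rng = np.random.default_rng(0)
scores = matched_filter_scores(rng.normal(size=(500, 50)), rng.normal(size=50))
print(scores.shape)   # (500,)
```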

The new observations stem from the broad coverage of the planet afforded by the space station’s orbit, as well as from EMIT’s ability to scan swaths of Earth’s surface dozens of miles wide while resolving areas as small as a soccer field.

“These results are exceptional, and they demonstrate the value of pairing global-scale perspective with the resolution required to identify methane point sources, down to the facility scale,” said David Thompson, EMIT’s instrument scientist and a senior research scientist at NASA’s Jet Propulsion Laboratory in Southern California, which manages the mission. “It’s a unique capability that will raise the bar on efforts to attribute methane sources and mitigate emissions from human activities.”

In the data EMIT has collected since being installed on the International Space Station last July, the science team has identified more than 50 “super-emitters” in Central Asia, the Middle East, and the Southwestern United States. Super-emitters are facilities, equipment, and other infrastructure, typically in the fossil-fuel, waste, or agriculture sectors, that emit methane at high rates.

The mission’s study area coincides with known methane hotspots around the world, enabling researchers to look for the gas in those regions to test the capability of the imaging spectrometer.

For example, the instrument detected a plume about 2 miles (3.3 kilometers) long southeast of Carlsbad, New Mexico, in the Permian Basin. One of the largest oilfields in the world, the Permian spans parts of southeastern New Mexico and western Texas.

In Turkmenistan, EMIT identified 12 plumes from oil and gas infrastructure east of the Caspian Sea port city of Hazar. Blowing to the west, some plumes stretch more than 20 miles (32 kilometers).

The EMIT team also identified a methane plume south of Tehran, Iran, at least 3 miles (4.8 kilometers) long, from a major waste-processing complex.

Scientists estimate flow rates of about 40,300 pounds (18,300 kilograms) per hour at the Permian site, 111,000 pounds (50,400 kilograms) per hour in total for the Turkmenistan sources, and 18,700 pounds (8,500 kilograms) per hour at the Iran site.

The Turkmenistan sources together have a similar flow rate to the 2015 Aliso Canyon, California gas leak, which exceeded 110,000 pounds (50,000 kilograms) per hour at times. The Los Angeles-area disaster was among the largest methane releases in U.S. history.

With wide, repeated coverage from its vantage point on the space station, EMIT will potentially find hundreds of super-emitters – some of them previously spotted through air-, space-, or ground-based measurement, and others that were unknown.


“As it continues to survey the planet, EMIT will observe places in which no one thought to look for greenhouse-gas emitters before, and it will find plumes that no one expects,” said Robert Green, EMIT’s principal investigator at JPL.

EMIT is the first of a new class of spaceborne imaging spectrometers to study Earth. Carbon Plume Mapper (CPM) is an instrument in development at JPL that's designed to detect methane and carbon dioxide. JPL is working with a nonprofit, Carbon Mapper, along with other partners, to launch two satellites equipped with CPM in late 2023.

Wednesday, November 2, 2022

Prince William Digital Gateway CPA Approved


This morning, just before 9 am and after a marathon all-night meeting lasting 14 hours, the Prince William County Board of County Supervisors voted to approve the Comprehensive Plan Amendment for the Prince William Digital Gateway. The vote was straight across party lines, with the Democrats voting for industrial development in the rural area and the Republicans voting against. (Supervisor Candland recused himself and did not attend.)

This development in the northern portion of the Rural Crescent threatens the health of the Occoquan watershed and the very sustainability and affordability of the drinking water supply for Northern Virginia, including 350,000 residents of Prince William County. When an undeveloped or generally open rural area is developed, stormwater runoff increases in quantity and velocity, washing away stream banks, flooding roads and buildings, and carrying fertilizers, oil and grease, and road salt to the Occoquan Reservoir.

The total amount of planned data center space exceeds the existing data center square footage in Loudoun County (the data center capital of the nation and the world). It took Loudoun County 14 years to build out its existing data centers, and Loudoun County still has approved data centers that have not yet been built. The majority of the 2,400 acres in the existing Data Center Overlay district are owned by data center development companies or directly by data center operators. This land has not yet been built on.

The one growing sector of electricity demand in Virginia is data centers, and, wow, is that growing. In 2018 power demand for data centers was just over 1 gigawatt; by this past fall of 2022 it had reached 2.6 gigawatts, and it is projected to double in the next few years with projects already under way.
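
To put that growth rate in perspective, here is a quick calculation from the figures above; the four-year span and the simple compound-growth assumption are mine:

```python
demand_2018_gw = 1.0      # data center power demand in 2018, gigawatts
demand_2022_gw = 2.6      # data center power demand by fall 2022, gigawatts
years = 4

annual_growth = (demand_2022_gw / demand_2018_gw) ** (1 / years) - 1
print(f"Implied compound growth: {annual_growth:.0%} per year")       # ~27% per year
print(f"Doubling from 2.6 GW would reach {2 * demand_2022_gw:.1f} GW")  # 5.2 GW
```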

It is clear that there is no limit to the desirability of data centers to county supervisors and landowners. The counties have been blinded by the windfall profits to the landowners and the prospect of increased tax revenue. This approval will more than double the number of data centers in all of Northern Virginia, and this massive change in use will bring great wealth to the landowners: land that was worth $25,000-$50,000 an acre is now magically worth almost $1,000,000 an acre when used for data centers. But these windfall profits come at the cost of degradation of our land and water resources and increased power and water costs for all Virginians.

I offer my congratulations to Maryanne Gahaban on orchestrating the sale of the 194 parcels and 2,139 acres of rural and rural residential land for $2.1 billion. Well played. Take your money and go.