Monday, September 30, 2013

IPCC Releases 5th Climate Report

Today the Intergovernmental Panel on Climate Change (IPCC) finally released the much anticipated Fifth Assessment Report on climate change. The summaries that had previously been leaked to the traditional press and reported in the newspapers (the IPCC also released an official summary on Friday) made the actual release of the report less newsworthy. As widely reported, the IPCC found that despite the current pause in the increase in surface temperature, the planet is warming and it is “extremely likely that the changes in our climate system for the past half a century are due to human influence.”

As the IPCC points out, confidence in the validity of this finding is based on the type, amount, quality, and consistency of evidence. The types of evidence used in the report were data collected over the past century, mechanistic understanding of the earth’s ecology, climate theory, mathematical models built to forecast the climate, and expert judgment. Observations of the climate system are based on direct measurements and remote sensing from satellites. Global-scale observations from the instrumental era began in the mid-19th century, predominately for temperature. More comprehensive data have been collected since the 1950s.

In their report the IPCC Working Group cites the overwhelming scientific consensus about the causes and results of climate change, but also calls for further assessments and projections, especially at the regional level and in the oceans. Multiple lines of evidence confirm to the authors that the extra heat being trapped by greenhouse gases is warming the Earth’s atmosphere and surface, heating and acidifying the oceans, raising sea levels, and melting ice caps and glaciers. The report states the decade 2001-2010 was the warmest on record, a decade in which more temperature records were broken than in any previous decade. However, the average surface temperature has actually not increased in the past decade.
from NOAA
For the period 1970 to 2000, the mean surface temperature as recorded by measurements increased 0.3 ± 0.04°F per decade. However, there has been little further warming of the surface of the planet, particularly over the oceans, in the most recent 10 to 15 years. The start of the current pause is difficult to pin down precisely. Although 1998 is often quoted as the start of the current pause, this was an exceptionally warm year because of the largest El Niño on record. It was followed by a strong La Niña event and a fall in global surface temperature of around 0.36°F, which is equal to the average global warming for an entire decade based on the previous 30 years. It is only really since 2000 that the rise in global surface temperatures has paused. According to the Met Office Hadley Centre (MOHC), there is still substantial evidence from other components of the climate system, beyond the global mean surface temperature, that the Earth has continued to warm over the last decade, and this evaluation is supported by the scientific consensus in the IPCC Working Group.

Analysis of simulated natural climate variability by the MOHC using the climate models indicates that, even with a long-term warming rate of 0.3°F per decade, at least two periods without apparent temperature increase lasting a decade would be expected each century. The current pause in global surface temperature rise is not viewed as exceptional, based on those model simulations. The scientists cite two potential mechanisms to explain the recent pause: the first involves changes to the total energy received by the planet (radiative forcing), and the second involves the low-frequency variability of the oceans and the way in which the oceans take up heat and store it below the surface in the deeper ocean. It is possible that a pause in surface warming could result from both mechanisms acting together, though according to the MOHC radiative forcing by greenhouse gases has continued unabated; that heat is being held in the system but has not manifested as a rise in global mean surface temperature.
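The MOHC argument is statistical: a steady warming trend plus natural year-to-year variability will, just by chance, produce decade-long stretches with no apparent warming. The little simulation below is my own back-of-the-envelope sketch of that idea (it is not the Met Office's model, and the variability and persistence parameters are assumptions I picked for illustration). It counts how often a fitted ten-year trend comes out flat or negative in a century of simulated temperatures.

```python
import numpy as np

rng = np.random.default_rng(42)

TREND_F_PER_DECADE = 0.3   # long-term warming rate cited above (deg F per decade)
NOISE_SD_F = 0.2           # assumed standard deviation of natural variability (deg F)
PHI = 0.6                  # assumed year-to-year persistence (ENSO-like), AR(1) coefficient

def natural_variability(n_years):
    """AR(1) noise as a stand-in for ENSO and other internal variability."""
    x = np.zeros(n_years)
    innovation_sd = NOISE_SD_F * np.sqrt(1.0 - PHI**2)  # keeps the series' std dev near NOISE_SD_F
    for t in range(1, n_years):
        x[t] = PHI * x[t - 1] + rng.normal(0.0, innovation_sd)
    return x

def decade_trend(temps):
    """Least-squares trend over a 10-year window, in deg F per decade."""
    years = np.arange(len(temps))
    return np.polyfit(years, temps, 1)[0] * 10.0

pause_counts = []
for _ in range(2000):  # simulate 2,000 centuries
    years = np.arange(100)
    temps = TREND_F_PER_DECADE / 10.0 * years + natural_variability(100)
    # count non-overlapping decades whose fitted trend is flat or negative
    pauses = sum(decade_trend(temps[i:i + 10]) <= 0.0 for i in range(0, 100, 10))
    pause_counts.append(pauses)

print("average decade-long 'pauses' per simulated century:", round(np.mean(pause_counts), 2))
```

With parameters in this neighborhood the simulation turns up roughly one to three decade-long pauses per century, which is the MOHC's point: a decade without surface warming is not, by itself, evidence that the long-term trend has stopped.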

Observations of ocean heat content and of sea-level rise suggest that the additional heat has instead been absorbed by the oceans. Changes in the exchange of heat between the upper and deep ocean appear to have caused at least part of the pause in surface warming, especially in the Pacific Ocean. In addition, the scientists point out the relative cooling influence of La Niña weather events. Since 2000 there have been no major El Niño events and indeed the tropical Pacific has been predominantly in La Niña states. These factors were not predicted in previous versions of the climate forecasting models. Thus, the IPCC cites the need for more research, especially on deep-sea temperatures.

The IPCC and the scientific consensus have pivoted slightly in their focus to the oceans and sea level rise. The IPCC reports a Working Group consensus that ocean warming dominates the increase in energy stored in the earth’s climate system, accounting for more than 90% of the energy accumulated between 1971 and 2010. Since the early 1970s, glacier mass loss and ocean thermal expansion from warming together explain about 75% of the observed global mean sea level rise; the rest is presumably attributed to the rise in sea levels that has been occurring unabated for over 10,000 years. “As the ocean warms, and glaciers and ice sheets reduce, global mean sea level will continue to rise, but at a faster rate than we have experienced over the past 40 years,” said Co-Chair Qin Dahe in the IPCC’s press release.

The IPCC Working Group expects global surface temperatures by the end of the 21st century to likely increase 2.7°F to 3.6°F relative to the 1850 to 1900 period. “Heat waves are very likely to occur more frequently and last longer. As the Earth warms, we expect to see currently wet regions receiving more rainfall, and dry regions receiving less, although there will be exceptions,” said Co-Chair Thomas Stocker in the press release. Dr. Stocker concluded his comments by reminding us that as a result of our past, present and expected future emissions of CO2, climate change is inevitable, and will persist for many centuries even if emissions of CO2 were to stop today.


Thursday, September 26, 2013

V’Ger has Left the Solar System

A 1977 NASA picture of Voyager 2 with its payload
I am old enough to have watched Star Trek in its original run on Thursday nights on a black and white TV. (When the show moved to Fridays I could no longer watch it; this was before DVRs.) My husband and his brother can actually have meaningful conversations by quoting dialogue from episodes to convey ideas. So, I was delighted that a newly published paper makes the case that NASA's Voyager 1 spacecraft might have already entered interstellar space, the space between stars.

Interstellar space is filled with plasma, or ionized gas, that has a lower temperature than what is inside our solar bubble, also known as the heliosphere. Interstellar space is reportedly about 10,000 degrees Fahrenheit (6,000 Kelvin). The solar bubble, the heliosphere, has a temperature of about 2 million degrees Fahrenheit (1 million Kelvin).
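For anyone who wants to check the rounding on those temperatures, the kelvin and Fahrenheit figures quoted are consistent; a trivial conversion sketch:

```python
def kelvin_to_fahrenheit(kelvin):
    """Convert an absolute temperature in kelvin to degrees Fahrenheit."""
    return kelvin * 9.0 / 5.0 - 459.67

# Figures quoted above, rounded to one significant digit in the press materials
print(f"{kelvin_to_fahrenheit(6_000):,.0f} degF")      # ~10,340 degF, reported as ~10,000 degF
print(f"{kelvin_to_fahrenheit(1_000_000):,.0f} degF")  # ~1,799,540 degF, reported as ~2 million degF
```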

A group of NASA scientists have developed a new model to analyze data from the Voyager spacecraft. The model, described in a recently published paper, differs from the models previously used to explain the data the Voyager spacecraft have been sending back from more than 11 billion miles (18 billion kilometers) away from our sun. It holds that on Aug. 25, 2012, Voyager 1 entered the depletion region, where the magnetic field acts as a kind of "magnetic highway" allowing energetic ions from inside the heliosphere to escape out, and cosmic rays from interstellar space to zoom in.

The data collected by the minimal equipment on Voyager measure the level of fast-moving charged particles, mainly protons, originating from far outside the heliosphere; the level of slower-moving charged particles, also mainly protons, from inside the heliosphere; and the direction of the magnetic field. The level of outside particles has increased dramatically, while the level of inside particles has fallen precipitously, as scientists watched. Both models of the heliosphere and interstellar space agree that the spacecraft is closing in on the edge of interstellar space. According to traditional models of interstellar space, scientists would need to see a change in the direction of the magnetic field to confirm that the spacecraft has sailed beyond the reach of the solar wind and finally arrived in the vast space between stars. The new model of interstellar space and the heliosphere does not require a change in magnetic field between the two.
Artist creation
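The signature the Voyager team describes is simple to state in code: the count rate of particles from outside jumps at about the same time the count rate of particles from inside collapses. The sketch below runs that test on made-up daily count rates; the data are synthetic and the thresholds are arbitrary choices of mine, purely to illustrate the logic, not the team's actual analysis.

```python
import numpy as np

rng = np.random.default_rng(0)
days = np.arange(200)
crossing_day = 120   # synthetic "boundary crossing" built into the fake data

# Synthetic daily count rates, loosely shaped like the published Voyager 1 plots:
# cosmic rays step up and heliospheric particles drop off after the crossing.
cosmic_rays = np.where(days < crossing_day, 1.0, 1.6) + rng.normal(0, 0.05, days.size)
helio_particles = np.where(days < crossing_day, 1.0, 0.05) + rng.normal(0, 0.03, days.size)

def detect_crossing(outside, inside, window=7, jump=1.3, drop=0.5):
    """Flag the first day where the weekly mean outside rate rises by `jump`x
    while the inside rate falls below `drop`x of its earlier weekly mean."""
    for t in range(window, len(outside) - window):
        before_out = outside[t - window:t].mean()
        after_out = outside[t:t + window].mean()
        before_in = inside[t - window:t].mean()
        after_in = inside[t:t + window].mean()
        if after_out > jump * before_out and after_in < drop * before_in:
            return t
    return None

print("detected crossing near day:", detect_crossing(cosmic_rays, helio_particles))
```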
NASA's Voyager project scientist, Ed Stone of the California Institute of Technology in Pasadena, explains:
"Details of a new model have just been published that lead the scientists who created the model to argue that NASA's Voyager 1 spacecraft data can be consistent with entering interstellar space in 2012. In describing on a fine scale how magnetic field lines from the sun and magnetic field lines from interstellar space can connect to each other, they conclude Voyager 1 has been detecting the interstellar magnetic field since July 27, 2012.”

If the new model of the data is correct, that would mean the interstellar magnetic field direction is the same as that of the magnetic field within the heliosphere, which originates from our sun. As Voyager 1 continues its mission and hopefully continues to be able to collect and send back data (despite a memory smaller than your phone's), we will learn whether the magnetic field does change direction. Determining the direction of the magnetic field requires periodic instrument calibrations and complicated analyses. These analyses typically take a few months after the charged-particle data are received on Earth, and the power remaining in Voyager is estimated to last another decade.

NASA's Voyager 2 spacecraft, the first of the two Voyagers to launch, departed on August 25, 1977 on a journey that would make it the only spacecraft to visit Uranus and Neptune and the longest-operating NASA spacecraft ever. Voyager 2 and its twin, Voyager 1, which launched 16 days later on Sept. 5, 1977, are both still operational. Voyager 2 is the longest-operating spacecraft, but has not traveled as far from home as Voyager 1. Voyager 2 has not yet reached the magnetic highway, though it has recently seen some modest drops in the heliospheric particle levels.
from NASA
If Voyager 1 has reached interstellar space, then it is exploring a region no spacecraft has ever been to before (I can’t resist: “Where No Man Has Gone Before”). In Star Trek: The Motion Picture, the threat to the Enterprise (and Earth) is V’Ger, which the crew discovers is, at its center, actually Voyager 6, designed to collect data and transmit it back to Earth. Voyager 6 supposedly disappeared through a black hole. (Wouldn’t it be cool to have data from within a black hole?) The probe was found by inhabitants of a planet on the other side of the galaxy, who discovered the probe's 20th century programming to collect data and return that information to its creator, rebuilt the probe, and sent it on its way.

Monday, September 23, 2013

Carbon, Coal and Government Action

As expected, the U.S. Environmental Protection Agency (EPA) last Friday once more proposed Clean Air Act standards to cut carbon pollution from new power plants. Under the new proposal, new large natural gas-fired turbines would need to meet a carbon dioxide (CO2) limit of 1,000 pounds of CO2 per megawatt-hour, while new small natural gas-fired turbines would need to meet a limit of 1,100 pounds of CO2 per megawatt-hour. New coal-fired units would need to meet a limit of 1,100 pounds of CO2 per megawatt-hour, and would have the option to meet a somewhat tighter limit if they choose to average emissions over multiple years. All existing plants, as well as plants already permitted and built within the next 12 months, will be grandfathered and exempt from this new rule for a period of time.
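Whether a new plant can meet a pounds-per-megawatt-hour limit is simple arithmetic: multiply its heat rate by its fuel's CO2 emission factor. The sketch below runs that arithmetic with approximate emission factors and heat rates I have assumed for illustration (they are ballpark EIA-style figures, not numbers from the EPA proposal).

```python
# Back-of-the-envelope check of the proposed CO2 limits, using emission factors
# and heat rates assumed for illustration; they are approximations, not EPA figures.

EMISSION_FACTOR_LB_PER_MMBTU = {
    "natural gas": 117.0,      # approx. lb CO2 per million Btu of fuel burned
    "bituminous coal": 205.7,  # approx.
}

HEAT_RATE_MMBTU_PER_MWH = {
    "new combined-cycle gas turbine": 7.65,  # assumed ~7,650 Btu/kWh
    "new supercritical coal unit": 8.8,      # assumed ~8,800 Btu/kWh
}

LIMIT_LB_PER_MWH = {"natural gas": 1000.0, "bituminous coal": 1100.0}

def co2_rate(fuel, plant):
    """Emission rate in lb CO2 per MWh = heat rate x fuel emission factor."""
    return HEAT_RATE_MMBTU_PER_MWH[plant] * EMISSION_FACTOR_LB_PER_MMBTU[fuel]

for fuel, plant in [("natural gas", "new combined-cycle gas turbine"),
                    ("bituminous coal", "new supercritical coal unit")]:
    rate = co2_rate(fuel, plant)
    verdict = "meets" if rate <= LIMIT_LB_PER_MWH[fuel] else "exceeds"
    print(f"{plant}: ~{rate:.0f} lb CO2/MWh ({verdict} the {LIMIT_LB_PER_MWH[fuel]:.0f} lb/MWh limit)")
```

On these assumed figures a new gas combined-cycle unit comes in around 900 lb CO2/MWh, while even an efficient coal unit is roughly 1,800 lb CO2/MWh, which is why the proposed standard effectively requires carbon capture for any new coal plant.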

This new proposal is a revision (and slight loosening) of the proposal made in March by the EPA. This is part of the President’s Climate Action Plan, which directs all federal agencies to address climate change using existing executive authorities. The EPA is the lead regulator of the plan to cut carbon pollution. The Plan has three key pillars: cutting carbon pollution in the United States; preparing the country for the impacts of climate change; and leading international efforts to combat global climate change. Power plants are the largest concentrated source of emissions in the United States, accounting for roughly one-third of all domestic greenhouse gas emissions. While the United States has federal limits on the arsenic, mercury and lead pollution that power plants can emit, there are currently no national limits on the amount of carbon pollution power plants can emit. This would be the first federal regulation to limit CO2.

Despite the progress being made on the carbon capture and sequestration system being developed by Alliant Techsystems and partner ACENT Laboratories to capture carbon from coal-fired power plants before it enters the atmosphere, the cost of capturing CO2 from power plants is currently too high for wide-scale implementation, and for now no more coal-fired power plants will be built. The continued existence of coal-fired power plants will depend on developing more affordable technologies for carbon capture, utilization and storage (CCUS). The Department of Energy (DOE) Loan Programs Office has launched a new loan guarantee program with $8 billion in loan guarantees available to develop new technologies.

The program seeks to make loans to advance the technology in three primary areas:
  1. Advanced Resource Development: projects that employ new or significantly improved technologies to avoid, reduce, or sequester air pollutants or greenhouse gas emissions from the development, recovery, and production of traditional and non-traditional fossil energy resources.
  2. Carbon Capture: technologies that selectively remove CO2 from process streams and flue gases and produce a concentrated stream that can be compressed and transported to a permanent storage site.
  3. Low-Carbon Power Systems: because natural gas electricity generation produces a flue gas with low concentrations of CO2, making carbon capture expensive and inefficient, the DOE is looking to finance natural gas generation using novel processes or improved technologies that can integrate with CO2 storage or beneficial reuse.
For now, the fuel of choice for all future power capacity additions will be natural gas, nuclear, or the renewable category (with government subsidies).

All federal agencies are required, “to assess both the costs and the benefits of intended regulation and, recognizing that some costs and benefits are difficult to quantify, propose or adopt a regulation only upon a reasoned determination that the benefits of the intended regulation justify its costs.” To justify the costs to the economy of implementing the President's Climate Plan, the agencies use the “social cost of carbon” (SCC) to bring the social benefits of reducing carbon dioxide (CO2) emissions into cost-benefit analyses of regulatory actions that in reality have only tiny impacts on cumulative global emissions. At this time in history, global emissions of CO2 are being driven by the growth in emissions in the emerging markets of China and India.

The SCC is an estimate of the dollar damages to the global economy associated with an incremental increase in carbon emissions in a given year. It includes estimated changes in net agricultural productivity from changes in temperature and precipitation patterns, impacts on human health, property damages from increased flood risk, loss of land from rising sea levels, and the loss of ecosystem services due to climate change. According to the National Research Council (NRC 2009), any assessment of the social cost of carbon will suffer from “uncertainty, speculation, and lack of information about (1) future emissions of greenhouse gases, (2) the effects of past and future emissions on the climate system, (3) the impact of changes in climate on the physical and biological environment, and (4) the translation of these environmental impacts into economic damages. As a result, any effort to quantify and monetize the harms associated with climate change will raise serious questions of science, economics, and ethics.”

Nonetheless, an interagency group made up of the EPA and the Departments of Agriculture, Commerce, Energy, Transportation, and Treasury, with input from the Council on Environmental Quality, National Economic Council, Office of Energy and Climate Change, and Office of Science and Technology Policy, selected three integrated assessment models (IAMs) commonly used to estimate the impact of carbon emissions on future global gross domestic product (GDP) and assigned a range of costs for a metric ton of CO2 based on the percentage of the global economy that the United States represents. The models used were the FUND, DICE, and PAGE models. These models are used in the Intergovernmental Panel on Climate Change (IPCC) assessments. The models produced a range of four estimated costs that were used for the social cost of carbon.

These models combine aspects of the climate change models, economic growth models, and feedbacks between the climate and the global economy into a single modeling framework, though there is only a limited amount of research linking climate impacts to economic damages. Underlying the models are a number of simplifying assumptions and judgments reflecting the various modelers’ best attempts to synthesize the available scientific and economic research and opinions characterizing these relationships and to translate global warming into damage estimates.

One of the most important factors influencing SCC estimates is the discount rate assumed. A large portion of climate change damages are expected to occur many decades into the future, and the present value of those damages (the value today of damages that occur in the future) is highly dependent on the discount rate assumed. Though there have been updates to the damage estimates based on additional work relating to rising sea levels since the SCC was developed in 2010, the assumptions for the discount rate were not revisited. The SCC is actually a range consisting of four scenario estimates for the year 2020. In 2010, when the interagency group first reported the SCC estimates, they were $7, $28, $44 and $86 per metric ton (2011$). This year the estimates were revised, and the corresponding four scenario SCC estimates for 2020 were $13, $46, $69, and $137 per metric ton (2011$). The average SCC increased from $41 to $66, an increase of over 60%, enabling a significantly larger positive benefit to be estimated for any carbon-reducing regulation.
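The discount rate sensitivity is ordinary compounding arithmetic: a dollar of climate damage decades from now is worth far less today at 5% than at 2.5%. The sketch below is my own illustration; the $100 damage figure and the horizons are hypothetical, and the rates simply mirror the 2.5%, 3% and 5% range the interagency group used.

```python
def present_value(future_damage, rate, years):
    """Standard discounting: PV = damage / (1 + rate)**years."""
    return future_damage / (1.0 + rate) ** years

# $100 of hypothetical climate damage occurring 50 or 100 years from now,
# discounted at the three rates the interagency group used (2.5%, 3% and 5%).
for years in (50, 100):
    values = ", ".join(f"{rate:.1%}: ${present_value(100.0, rate, years):.2f}"
                       for rate in (0.025, 0.03, 0.05))
    print(f"damage {years} years out -> present value at {values}")
```

A century out, the same dollar of damage is worth roughly ten times more at 2.5% than at 5%, which is why the choice of discount rate dominates the SCC estimates.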

Thursday, September 19, 2013

The End of Coal May Not Be the Time of Methane

On Wednesday, Gina McCarthy, the U.S. Environmental Protection Agency Administrator, testified before the House Committee on Energy and Commerce’s Subcommittee on Energy and Power. Ms. McCarthy spoke about the EPA’s plans for the United States within the framework of the directions given to federal agencies last June saying: “The President’s Climate Action Plan directs federal agencies to address climate change using existing executive authorities. The Plan has three key pillars: cutting carbon pollution in America; preparing the country for the impacts of climate change; and leading international efforts to combat global climate change.”

The first steps of the President’s and EPA’s climate program addressed motor vehicles, which emit nearly a third of U.S. carbon pollution. The EPA and the Department of Transportation’s National Highway Traffic Safety Administration (NHTSA) issued new mileage and emission standards for automobiles and light trucks for model years 2012 through 2016 that require vehicles to meet an estimated combined average emissions level of 250 grams of carbon dioxide (CO2) per mile in model year 2016, equivalent to 35.5 miles per gallon (mpg) if the automotive industry were to meet this CO2 level entirely through fuel economy improvements. A second set of standards requires continued improvement of about 5% per year in average fuel economy from 2016 to 2025, which will raise car and light truck fuel economy to an average of 56.2 miles per gallon by 2025.
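The equivalence between grams of CO2 per mile and miles per gallon is straight division, because burning a gallon of gasoline releases a roughly fixed mass of CO2 (about 8,887 grams is the commonly cited factor). A quick sketch of the conversion, assuming all reductions come from burning less gasoline:

```python
GRAMS_CO2_PER_GALLON_GASOLINE = 8887.0  # commonly cited factor for gasoline combustion

def mpg_equivalent(grams_co2_per_mile):
    """Fuel economy that would produce this CO2 rate if all reductions came from burning less gasoline."""
    return GRAMS_CO2_PER_GALLON_GASOLINE / grams_co2_per_mile

def grams_per_mile(mpg):
    """Inverse conversion: tailpipe CO2 rate for a given fuel economy."""
    return GRAMS_CO2_PER_GALLON_GASOLINE / mpg

print(f"{mpg_equivalent(250):.1f} mpg")      # ~35.5 mpg, the model year 2016 standard quoted above
print(f"{grams_per_mile(56.2):.0f} g/mile")  # ~158 g CO2/mile for the 2025 fleet average cited above
```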

After addressing automobiles, the President asked EPA to develop plans to reduce carbon pollution from future and existing power plants, which are responsible for about 40% of America’s carbon dioxide emissions. This month EPA is expected to release the revised Carbon Pollution Standard for New Power Plants that was first announced in 2012 and limits the amount of CO2 that can be produced for each megawatt of electricity generated. Under the revised rule, it is expected that new power plants will have to emit no more than 1,100 pounds of carbon dioxide per megawatt-hour of energy produced. That standard will effectively change the fuel of choice for all future power capacity additions to natural gas, nuclear, or the renewable category (with government subsidies). All existing plants, as well as plants already permitted and built within the next 12 months, will be grandfathered and exempt from this new rule for a period of time. Reductions in CO2 generation from power plants will not improve human health, but the official “social cost” of carbon dioxide used by the EPA has risen to about $65 per ton.

EPA has also issued other regulations targeted at coal-fired power plants: the Cross-State Air Pollution Rule (CSAPR), the Mercury and Air Toxics Standard (MATS), and the lowering of the primary annual 2.5 micron particulate standard (PM 2.5) to 12 micrograms per cubic meter. CSAPR, which requires reductions of sulfur-dioxide and nitrogen-oxide emissions at coal-fired plants, was made final in July, but at the end of last year the U.S. Court of Appeals for the District of Columbia Circuit granted a stay to the implementation of CSAPR pending resolution of the legal challenges. MATS regulates mercury, arsenic, acid gas, nickel, selenium, and cyanide and was finalized on December 21, 2011. All of these regulations are anticipated to have direct human health benefits in addition to reducing the ability of coal-fired power plants to operate. There will be a reduction in the number of coal-fired power plants, and no new coal plants will be built. Domestic power plants represent 92% of the market for coal. That market will shrink and wither, and the age of coal will end.

The President’s Plan also calls for the development of a comprehensive, interagency strategy to address emissions of methane, a powerful greenhouse gas that also contributes to ozone pollution. So it remains unclear whether regulations aimed at methane will reduce the feasibility of using our abundant natural gas resources as the primary fuel for power generation and for heating commercial and residential buildings.

Even as EPA works to reduce carbon dioxide emissions in the United States, it is incorporating research on climate impacts into the implementation of its regulatory programs. According to Ms. McCarthy, EPA is working to build national resilience to climate change, including developing the National Drought Resilience Partnership, ensuring the security of our freshwater supplies, protecting our water utilities, and protecting and restoring our forests in the face of a changing climate. In addition, EPA will continue to engage in discussions with other nations to develop strategies for reducing carbon pollution through an array of activities: “These include public-private partnership efforts to address emissions of methane and other short-lived climate pollutants under the Climate and Clean Air Coalition and the Global Methane Initiative, as well as bilateral cooperation with major economies.”

Monday, September 16, 2013

SepticSmart Week



When homeowners flush and don’t think about their home’s septic system, it can lead to system back-ups and overflows, sewage surfacing in the yard (which can be expensive to fix), polluted local waterways, and risks to public health and the environment. Nonetheless, Virginia, like many states, has struggled to get homeowners to properly maintain their septic systems, both conventional and alternative. Homeowners fail to see or simply ignore indications that their septic systems have failed, do not pump their tanks at appropriate intervals, and do not comply with inspection and maintenance regulations for alternative systems. While the Virginia Department of Health (VDH) holds meetings and struggles for solutions, the U.S. Environmental Protection Agency (EPA) has launched the first annual SepticSmart Week, September 16-20, 2013, to encourage homeowners to get “SepticSmart.”

The United States has made tremendous advances in the past 35 years to clean up our rivers and streams under the Clean Water Act by controlling pollution from industry and sewage treatment plants. In order to continue to make progress in cleaning up our rivers and streams, EPA has turned its focus to controlling pollution from diffuse, or nonpoint, sources. According to EPA, nonpoint source pollution remains the nation's largest source of water quality problems. EPA cites nonpoint source pollution as the reason 40% of our surveyed rivers, lakes, and estuaries are not clean enough to meet basic uses such as fishing or swimming. To continue to improve the quality of the surface water and groundwater in the United States, the EPA wants to expand its programs to include control and oversight of nonpoint sources of contamination and has used methods such as the Chesapeake Bay Total Maximum Daily Load (TMDL) limits for sediment and the nutrients phosphorus and nitrogen.

Nonpoint source pollution occurs when rainfall, snowmelt, or irrigation runs over land or through the ground, picks up pollutants, nutrients, and sediment, and carries them to streams and on into rivers, lakes, and coastal waters, or percolates into the ground and groundwater. Agriculture, forestry, grazing, septic systems, vehicles (cars, trucks, trains, and boats), urban runoff, construction, physical changes to stream channels and the land surface, and habitat degradation are all potential sources of nonpoint source pollution. Careless or uninformed household management also contributes to nonpoint source pollution. Unfortunately, we have not done enough to control pollution from these diffuse, or nonpoint, sources, from our homes and daily lives.

Non-point source contamination has always been under the oversight of the states, and the nature of these sources makes it very challenging for even state and local regulatory agencies to make progress. EPA has used the Chesapeake Bay TMDL to force the states to develop plans to manage and reduce nonpoint source pollution. In the past, public and private groups have developed and used pollution prevention and pollution reduction initiatives. One example is the Soil and Water Conservation Districts, which help educate citizens about their watershed and assist farmers in implementing best management practices and other nonpoint pollution controls, using cost-share dollars from the state and developing nutrient management plans. Nonetheless, more than environmental education activities seems necessary to get citizens to implement best practices and low-impact development strategies and control their own sources of nonpoint pollution, starting with the most basic maintenance and care of their septic systems.

Simply pumping out your septic tank would be a good start at reducing nonpoint pollution, but homeowners just don’t do it. EPA and the Commonwealth of Virginia (through the VDH) have struggled with the challenges of better management of septic systems. There are more than 26 million septic systems in the United States, serving almost a quarter of all U.S. households. It is assumed that in Virginia, a fairly rural state, at least a quarter of households use a septic system to treat their wastewater. Proper septic system care and maintenance is vital to protecting public health and preserving valuable water resources and the environment, but has been difficult to achieve.

In Virginia alternative septic systems, called AOSS, are regulated, but compliance with the regulations has been poor. The VDH has been holding stakeholder meetings to develop recommendations to increase homeowner and private sector participation in their program which requires an annual inspection of a system (by a licensed operator), regular maintenance and regular pumping of the tank.

Taking the steps recommended by the EPA for SepticSmart Week would be a great start at reducing nonpoint pollution of our waters. Homeowners can do their part by following these SepticSmart tips:
  1. Protect It and Inspect It: In general, homeowners should have their traditional septic system inspected every three years and their alternative system inspected annually by a licensed contractor and have their tank pumped when necessary, generally every three to five years. 
  2. Think at the Sink: Avoid pouring fats, grease, and solids down the drain, which can clog a system’s pipes and drainfield.
  3. Don’t Overload the Commode: Ask guests to put only things that belong there in the drain or toilet. For example, coffee grounds, dental floss, disposable diapers and wipes, feminine hygiene products, cigarette butts, and cat litter can all clog and potentially damage septic systems. Flushable wipes are not flushable and do not break down in a septic tank.
  4. Don’t Strain Your Drain: Be water efficient and spread out water use. Fix plumbing leaks, install faucet aerators and water-efficient products, and spread out laundry and dishwasher loads throughout the day and week. Too much water at once can overload a system if it hasn’t been pumped recently. 
  5. Shield Your Field: Remind guests not to park or drive on a system’s drainfield, where the vehicle’s weight could damage buried pipes or disrupt underground flow.

Thursday, September 12, 2013

Climate Change or Weather

Last summer it was extremely hot and dry here in Prince William County, Virginia. It was the year my heat pump failed, others had wells go dry, and the Interstate Commission on the Potomac River Basin (ICPRB) commissioned a study of water supply availability from the Potomac watershed under various climate scenarios to determine if the water supply would be adequate to serve the population. This year is a different story. The summer has been cooler and wetter. Drought here is a distant memory. The summer is ending with only a couple of weeks above 90 degrees and no days in the triple digits, and my garden is green.

The climate of the earth is constantly changing, and the oceans have been rising for 10,000 years. Scientific studies and computer models have indicated that over the past century the earth has warmed 1.3°C. This warming is not particularly alarming in itself given our planetary history, but the speed of this temperature increase, and the fact that the warming is projected to continue at an accelerated pace due to carbon dioxide concentrations in the atmosphere, is worrisome. The planetary warming is forecast to cause sea levels to rise at an accelerated rate due to melting of land-based ice in parts of the world, and to cause changes in weather patterns and precipitation. If carbon dioxide (CO2) concentrations in the atmosphere are the driving force in earth’s temperature, then some portion of the weather extremes recently experienced are being caused by man.

According to the report “Explaining Extreme Events of 2012 from a Climate Perspective,” released this week by the Bulletin of the American Meteorological Society, some of the extreme weather events of last year had mankind as one of their causes. Overall, 18 different research teams from around the world worked on the peer-reviewed report, which examined the causes of 12 extreme weather events that occurred on five continents and in the Arctic during 2012. Hurricane Sandy slammed into the U.S. mid-Atlantic seaboard on October 29–30, 2012, causing widespread damage and devastating disruption to critical infrastructure. Hurricane Sandy broke 16 historical storm-tide records along the East Coast; though Sandy’s magnitude on the Saffir-Simpson hurricane wind scale was not particularly large, its westward strike heading was very unusual and it struck at high tide. Since 1851, nine other hurricanes (Category 1 and 2) have made landfall with similar proximity, but all were heading north-northeastward. It was concluded that climate changes caused by man had no significant impact on that storm or the damage it caused. ($60.2 billion has been allocated by Congress to fund repair and mitigation measures.) However, the authors note that in the future rising sea levels could make smaller storms more likely to cause devastating damage.

Likewise, human-induced climate change was found to have had little impact on the lack of precipitation in the central United States in 2012, which continues in the current drought. However, in the section of the report titled “The Extreme March–May 2012 Warm Anomaly Over the Eastern United States,” by Thomas R. Knutson, Fanrong Zeng, and Andrew T. Wittenberg, the authors found that approximately 35% of the extreme warmth experienced in the eastern U.S. between March and May 2012 can be attributed to human-induced climate change, and that such high temperatures are now likely to occur four times as frequently due to human-induced climate change.

However, the result is sensitive to the base period used and to assumptions about weather variability. The near-record Atlantic Ocean warmth off the east coast of the United States during March to May 2012 was analyzed using a “multistep attribution” approach from Hegerl et al. (2009). This involves an assessment that attributes the observed change in seasonal mean temperature extremes to a change in climate, and a separate assessment that attributes the change in climate and/or environmental conditions to external drivers and factors. The observed trends in the figure below indicate that (according to the model-generated variability) the measured temperature extreme in 2012 was inconsistent with internal climate variability alone.
from Knutson et al
This was determined by using a control period of weather as a surrogate for the possible natural variability of temperatures. Since the heat wave of March–May 2012 occurred in a region with what the authors call “detectable long-term anthropogenic warming,” they concluded that anthropogenic forcing also likely contributed significantly to the observed temperatures in 2012. They state that a rough estimate of the anthropogenic contribution would be about 35% (based on the modeled value of ~1.3°C and the 2012 observed temperature anomaly of ~3.7°C). This 3.7°C event was 2.8 times stronger than the expected 1.3°C due to anthropogenic forcing in 2012; so, according to the authors, weather variability also played a substantial role.

The authors have simply assumed that the 1.3°C portion of the anomaly is due to anthropogenic forcing, as predicted by previous modeling of the climate. The estimate of the contribution of anthropogenic forcing to the observed anomaly is therefore sensitive to two assumptions: the accuracy of the roughly 1.3°C of warming produced by the models, and the baseline period chosen by the authors. Here they used the period 1881–1920 as the baseline; if they had used 1861–2012 as the baseline period, the risk of the event would increase by about a factor of 5 rather than 12, and the portion of the temperature anomaly attributed to anthropogenic forcing would be 22%. If the average temperature increase due to man were assumed to be lower, then the contribution of anthropogenic forcing would be less, and vice versa.
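The percentages in the Knutson et al. discussion are just ratios of the modeled anthropogenic warming to the observed anomaly, which makes the sensitivity easy to see. Here is a quick check of the arithmetic quoted above, plus a couple of hypothetical lower forcing values to show how the attributed share scales (the 22% figure for the alternative baseline comes from the paper itself, not from this arithmetic):

```python
observed_anomaly_c = 3.7        # observed March-May 2012 anomaly over the eastern U.S. (deg C), as quoted
anthropogenic_warming_c = 1.3   # modeled anthropogenic contribution (deg C), as quoted

fraction = anthropogenic_warming_c / observed_anomaly_c
print(f"anthropogenic share of the anomaly: {fraction:.0%}")                                  # ~35%
print(f"observed anomaly vs. forced warming: {observed_anomaly_c / anthropogenic_warming_c:.1f}x")  # ~2.8x

# Sensitivity: a smaller assumed anthropogenic warming shrinks the attributed share in proportion.
for assumed_warming in (1.3, 1.0, 0.8):   # hypothetical values, for illustration only
    print(f"assumed forcing {assumed_warming} degC -> attributed share {assumed_warming / observed_anomaly_c:.0%}")
```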

The accuracy of climate models at capturing regional variability is unclear. On a whole-earth basis, the climate models show that at this point there is nothing we can do to stop global warming and climate change. What is going to happen will happen.

Monday, September 9, 2013

Increasing the Efficiency of Solar Cells

from NREL

Another step has been taken toward making solar power a viable source of electricity in our future. North Carolina State University researchers have created a new technique for improving the overall efficiency of solar panels and solar concentrator cells. As documented in an article published September 5, 2013 in Applied Physics Letters, entitled “Effect of GaAs interfacial layer on the performance of high bandgap tunnel junctions for multijunction solar cells,” Joshua Samberg, Zachary Carlin, Geoff Bradshaw, Jeff Harmon, and J.P. Allen, all graduate students at North Carolina State University, along with Dr. Peter Colter, a research assistant professor of electrical engineering, and Dr. John Hauser, an emeritus professor of electrical engineering, have discovered that by inserting a very thin film of gallium arsenide into the connecting junction of stacked solar cells they can eliminate voltage loss without blocking any of the solar energy, opening up the potential for vastly more efficient solar cells.

Back in 2002 it was discovered that it might be possible to create a photovoltaic cell sensitive to the full solar spectrum by stacking multiple negatively and positively doped layers to form several current-producing junctions. Multijunction devices or stacked solar cells use a high-bandgap top cell to absorb high-energy photons while allowing the lower-energy photons to pass through. A material with a slightly lower bandgap is then placed below the high-bandgap junction to absorb photons with slightly less energy (longer wavelengths).

The maximum theoretical efficiency that a single-bandgap solar cell can achieve with non-concentrated sunlight is about 33.5%, primarily because of the broad distribution of solar-emitted photons. This limiting efficiency, known as the Shockley-Queisser limit, arises in part from the fact that the open-circuit voltage of a solar cell is limited by the bandgap of the absorbing material. Photons with energies greater than the bandgap are absorbed, but any energy above the bandgap is lost as heat; photons with energies below the bandgap are not absorbed and are not converted at all.
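Those two loss channels, thermalization of above-bandgap photons and transparency to below-bandgap photons, can be tallied with a simple blackbody integral. The sketch below is my own rough calculation treating the sun as a 5778 K blackbody; it reproduces only the “ultimate efficiency” part of the Shockley-Queisser argument, not the full detailed-balance limit of about 33.5%, and the bandgap values in the loop are just illustrative.

```python
import numpy as np

K_B_EV = 8.617e-5   # Boltzmann constant, eV per kelvin
T_SUN = 5778.0      # approximate blackbody temperature of the sun, K

def ultimate_efficiency(bandgap_ev, t_kelvin=T_SUN, e_max_ev=10.0, n=200_000):
    """Fraction of blackbody power a single junction delivers if every absorbed photon
    yields exactly the bandgap energy (the excess is thermalized) and sub-bandgap
    photons are not absorbed at all. This counts only the two loss channels described
    in the text; the full Shockley-Queisser limit (~33.5%) includes further losses."""
    e = np.linspace(1e-3, e_max_ev, n)            # photon energy grid, eV
    de = e[1] - e[0]
    kt = K_B_EV * t_kelvin
    photon_flux = e**2 / np.expm1(e / kt)         # ~ blackbody spectral photon flux
    power_flux = e**3 / np.expm1(e / kt)          # ~ blackbody spectral power flux
    absorbed = e >= bandgap_ev
    delivered = bandgap_ev * photon_flux[absorbed].sum() * de
    incident = power_flux.sum() * de
    return delivered / incident

for eg in (0.7, 1.1, 1.4, 1.8):   # illustrative bandgaps, eV
    print(f"bandgap {eg:.1f} eV -> ideal single-junction share of sunlight ~{ultimate_efficiency(eg):.0%}")
```

For bandgaps in this range the usable share peaks at roughly 44% near a 1.1 eV bandgap and falls off on either side, which is the motivation for stacking junctions with different bandgaps.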

Typical multijunction cells use two or three absorbing layers, but the pattern of decreasing bandgaps could, in principle, be repeated to create many junctions. The theoretical maximum efficiency increases with the number of junctions, but the connecting junctions lose energy, limiting practical designs to about three layers before the loss of energy becomes too large.

Three-junction devices using elements from the III and V columns of the periodic table, such as gallium- and germanium-based semiconductors, have reached efficiencies of greater than 44% using sunlight concentrated to 947 suns. This record was verified by the National Renewable Energy Laboratory (NREL). Now, the researchers at North Carolina State University have created a connecting junction that loses almost no voltage, even when the stacked solar cell is exposed to 70,000 suns of solar energy.

The research at North Carolina State University was underwritten by the U.S. Department of Energy's Energy Efficiency and Renewable Energy SunShot Initiative, which invests in multijunction (stacked) solar cell research and in concentrating lens methods to be used with multijunction solar cells, to achieve greater solar cell efficiency and someday reduce the cost of solar generated power. The goal of the program is to make solar power cost effective by 2020. A junction that does not lose voltage when exposed to 70,000 suns should be more than sufficient for practical purposes, since concentrating lens research currently underway indicates that lenses are unlikely to deliver more than 4,000 or 5,000 suns worth of energy. This discovery means that solar cell manufacturers can now create multijunction stacked cells that can handle these high-intensity solar energies without losing voltage at the connecting junctions, allowing more layers and potentially improving conversion efficiency. However, the usefulness of this discovery will depend on cost.

In the past, stacked solar cells have primarily been used in space, where a premium is placed on lightweight power generation, which allows for the use of this relatively high-cost solar technology. For terrestrial electricity generation, the high cost of these semiconductors compared to silicon can be offset by using concentrating lenses to increase the power the cells are exposed to from one sun (no lens) to 4,000-5,000 suns or more. Increasing the amount of light incident on the solar cell leads to more power production from the multijunction devices. Using concentrating lenses also requires sun-tracking equipment to optimize the utilization time of the expensive cells, which must be factored into the cost of the system. Due to the land and area requirements of tracking and concentrating systems, and the cost of the multijunction cells themselves, multijunction solar devices will remain limited to large commercial or utility applications and are unlikely to be cost effective for consumer applications in my lifetime.
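The economics of that trade-off can be put in rough numbers: concentration spreads the cost of an expensive multijunction cell over many more watts, but adds the cost of lenses and trackers. The toy model below uses entirely hypothetical prices that I made up to show the shape of the trade-off, not actual system costs.

```python
# Toy cost-per-watt comparison for a multijunction cell at one sun versus under a
# concentrating lens. All prices are hypothetical placeholders, chosen only to
# illustrate why concentration is what makes expensive III-V cells economic.

ONE_SUN_W_PER_CM2 = 0.1   # ~1,000 W/m^2 of direct sunlight

def cost_per_watt(cell_cost_per_cm2, efficiency, concentration, optics_cost_per_cm2_aperture=0.0):
    """Dollars per peak watt for a cell of given efficiency under a given concentration.
    Optics/tracking cost is charged per cm^2 of lens aperture (aperture = cell area x concentration)."""
    watts_per_cm2_cell = efficiency * concentration * ONE_SUN_W_PER_CM2
    cell_cost_per_watt = cell_cost_per_cm2 / watts_per_cm2_cell
    optics_cost_per_watt = optics_cost_per_cm2_aperture / (efficiency * ONE_SUN_W_PER_CM2)
    return cell_cost_per_watt + optics_cost_per_watt

# Hypothetical numbers: a $10/cm^2 multijunction cell at 40% efficiency,
# with optics and tracking at $0.005 per cm^2 of collecting aperture.
print(f"1 sun, no optics:    ${cost_per_watt(10.0, 0.40, 1):.2f}/W")
print(f"500 suns + optics:   ${cost_per_watt(10.0, 0.40, 500, 0.005):.2f}/W")
print(f"4,000 suns + optics: ${cost_per_watt(10.0, 0.40, 4000, 0.005):.2f}/W")
```

With these made-up numbers the cell itself dominates the cost at one sun and becomes almost irrelevant at thousands of suns, at which point the optics, trackers and land set the price, which is exactly the argument the paragraph above makes.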

Nonetheless, the North Carolina State University finding is important because the two lines of research being supported by the Department of Energy and the National Science Foundation as part of the SunShot Initiative are to use lenses to concentrate solar energy to 4,000-5,000 suns and to build multijunction stacked cells that can efficiently use that concentrated solar power. Existing multijunction solar cells begin losing voltage if the solar energy is concentrated above 700-1,000 suns, and the more intense the solar energy, the more voltage those junctions lose, thereby reducing the conversion efficiency. By using the North Carolina State University discovery of inserting a very thin film of gallium arsenide into the connecting junction of stacked solar cells, manufacturers can eliminate that voltage loss without blocking any of the solar energy and potentially achieve efficiencies beyond the current 43.3% for a multijunction architecture with gallium arsenide in the connecting junction.
from DOE
The research was funded by the U.S. Department of Energy and the National Science Foundation, and it represents your tax dollars at work.

Thursday, September 5, 2013

Tim Hugo Holds a North South Corridor Press Conference

On Wednesday, September 4th, 2013, Delegate Tim Hugo (R-40th, Virginia House of Delegates), along with Delegate Michael Webert (R-18th), Delegate Randy Minchew (R-10th), Delegate Bob Marshall (R-13th), and Prince William County School Board Members Alyson Satterwhite and Gil Trenum, held a news conference in front of Sudley Methodist Church on Sudley Road within the Manassas Battlefield. The news conference, attended by about 75 community members and media, was to announce that Tim Hugo, along with Bob Marshall, Randy Minchew, Michael Webert, and the other elected officials listed below, has sent a letter to Governor McDonnell regarding the North/South Corridor project, also known as the Bi-County Parkway, asking for a meeting with the Governor. Delegate Hugo will personally deliver a copy of the letter today in Richmond.

The news conference was held ahead of today’s scheduled meeting of the Virginia Department of Transportation (VDOT) and the Commonwealth Transportation Board (CTB). At the meeting they will discuss and potentially sign the National Historic Preservation Act Section 106 Programmatic Agreement for the Bi-County Parkway, which would provide $7 million to acquire the private land and design the Battlefield Park Bypass. In addition, once the Environmental Impact Statement (EIS) is finalized in the near future, the Federal Highway Administration will sign the Record of Decision and VDOT will be free to begin the design phase of the Prince William County portion of the project with the $12 million already allocated to the project.

Though in public meetings it was emphasized that the roadway is 25 years away, as Delegates Hugo and Marshall pointed out, VDOT and the CTB have made several misleading and conflicting claims and statements about this roadway. Meanwhile, as Delegate Hugo noted, opposition to the Bi-County Parkway has grown well beyond the small group of activist residents led by Mary Ann Ghadban, Philomena Hefter, and Page Snyder. According to Delegate Hugo, 700 people attended the last town hall meeting, and a group of supporters attended the news conference. A coalition has seemingly grown from a “diverse group who could not agree on the time of day but can agree that this is not the right project.”

In the news conference Delegate Hugo beseeched the CTB, VDOT and the Governor to “Stop. Think. Slow down. Listen. This (Bi-County Parkway) is not the right project.” Delegate Randy Minchew followed Delegate Hugo, and though Delegate Minchew (whose district covers sections of Loudoun, Clarke and Frederick counties) supported the transportation budget, he wanted to take the time to make sure that the money is well spent. He, too, felt the process needed to slow down to get more input from the citizens of the Commonwealth.

Delegate Bob Marshall pointed out the false and misleading statements that have been made by VDOT and the CTB about the Bi-County Parkway. In addition, Delegate Marshall felt false claims have been made about the desirability and usefulness of the road to Dulles.

The Delegates felt that Governor McDonnell and Secretary of Transportation Sean Connaughton were rushing the Bi-County Parkway project through to assure their legacy before they leave office. According to the Delegates, this would be the wrong legacy to leave. The constituents need to have a voice, and the concerns raised need time to be addressed in a fiscally responsible manner. Though, according to Delegate Marshall, Secretary Connaughton said "most will go along with the road after they use this issue for campaigning,” the fight against the road can continue even if the National Historic Preservation Act Section 106 Programmatic Agreement is signed today. Delegate Hugo does not think this is a partisan issue (though the list below is all Republican); he called the opposition to the Bi-County Parkway a citizen issue. “If the Programmatic Agreement is signed we will continue the fight in January when the House of Delegates is in session.” The legislature can fight with budget amendments, legislation, and by changing the composition of the Commonwealth Transportation Board.

In case you are new to the issue, the North South Corridor or Bi-County Parkway will be a limited access highway approximately 45 miles in length running through what is now the Rural Crescent, predominantly agricultural and rural lands; it is essentially a more direct route for cargo and truck traffic connecting I-95 to Dulles Airport and Route 7. The only access points in Prince William County will be I-66, Route 29, and existing Route 234 west of the Battlefield. The new road will also be called 234, will run 2.5 miles west of Sudley Road, and will be expanded to carry four lanes of traffic.
In addition to the $19 million mentioned above, funds have been allocated for “traffic calming” on Route 29 through the park ahead of the development of any other roadway. VDOT has not explained what kinds of traffic calming measures would be used. In addition, though Sudley Methodist Church will maintain an access route and be eligible for signage, its road will be closed to the public. Prince William County community objections to this planned parkway have focused on several issues that are still of concern to the community:

  1. The Bi-County Parkway will drive all the east-west traffic from Route 29, which will be effectively closed to through traffic by "traffic calming measures," onto I-66, increasing traffic on that road.
  2. The Bi-County Parkway is intended to be a 4-lane and 6-lane highway that will provide direct access to Dulles Airport but limited access to the Prince William community, yet it will utilize a section of the Rural Crescent for the road, essentially destroying the intent of the Rural Crescent.
  3. The planned road will require that Virginia invoke eminent domain to take land from more than a dozen homeowners. The Programmatic Agreement allocates $3 million to acquire the desired land.
  4. Route 234 through the Battlefield, providing road access to several businesses and Sudley Methodist Church (which predates the Civil War), will be eliminated. According to Reverend Mitchell, closing the road to through traffic will remove the Church from the everyday lives of its members and potential members in the community, and effectively landlock and isolate the Church within the park to a slow death. This has happened to other churches.
  5. Closing Route 234 through the park and Route 29 through the park to through traffic essentially isolates northwestern Prince William County from the rest of the county and Manassas. There is no route from Heathcote Health Center to Prince William Hospital without going on I-66. The only route from Dominion Valley, Regency, and all the development on Route 15 to Manassas or anywhere else will be I-66, which will become the sole way to cross from western Prince William County to eastern Prince William County.
  6. The Bi-County Parkway does nothing to improve east-west traffic; instead it provides connectivity to the airport that Prince William residents do not want, divides the county, eliminates connections within Prince William County, and only benefits the Loudoun County developments.
  7. Delegate Hugo stated that if this road is built, groundwater in the Rural Crescent would be severely threatened. The route through Prince William County’s Rural Crescent potentially damages our watershed and water resources. The Rural Crescent provides a significant portion of the green infrastructure of our community. Maintaining intact, connected natural landscapes is essential for basic ecosystem and watershed preservation to ensure that there will always be clean air and water in Northern Virginia. The Northern Virginia Regional Commission (NVRC) has called the corridor one of three priority conservation areas for the region.

Other signatories to the letter to the Governor are:
State Senator Dick Black (R-13th)
State Senator Richard Stuart (R-28th)
State Senator Jill Holtzman Vogel (R-27th)
Delegate Rich Anderson (R-51st)
Delegate Tim Hugo (R-40th)
Delegate Scott Lingamfelter (R-31st)
Delegate Bob Marshall (R-13th)
Delegate Randy Minchew (R-10th)
Delegate David Ramadan (R-87th)
Delegate Michael Webert (R-18th)
Prince William County Board Supervisor Maureen Caddigan (R-Potomac)
Prince William County Board Supervisor Pete Candland (R-Gainesville)
Loudoun County Board Supervisor Janet Clarke (R-Blue Ridge)

Monday, September 2, 2013

Salmonella Bacteria, Backyard Chickens and Safer Eggs

According to the Centers for Disease Control and Prevention (CDC), Salmonella bacteria cause about 1.4 million cases of foodborne illness each year. Salmonella bacteria in eggs account for 80% of Salmonella enteritidis infections. Symptoms (fever, diarrhea, abdominal cramps, and headache) can last a few days but often last longer and can lead to severe complications or death. Salmonella causes more deaths than any other foodborne bacteria. The Salmonella bacteria entered the chicken population in the last fifty or sixty years. Chickens harbor the Salmonella bacteria without any signs of illness, making it impossible to know which animals are infected, and they pass Salmonella along to both the yolk and white while the egg is forming in the ovaries, as well as to future generations of chickens. It is impossible to tell by appearance which eggs might be infected, and though in truth only a tiny proportion of eggs are infected, millions of eggs are consumed each day.

Federal regulations require pasteurization of raw liquid egg products used in commercially sold dishes such as ice cream, eggnog, sauces, and Caesar dressings, but raw eggs sold in the shell to consumers are not required to be pasteurized. Less than 0.5% of all shell eggs produced for retail sale in the United States are pasteurized, according to the U.S. Department of Agriculture (USDA). I take food safety very seriously since I not only feed my family every day, but also cook and host holiday celebrations and prepare food for community and group gatherings. A family tradition (and specialty) is homemade eggnog. It is made with raw eggs. The alcohol does not kill the Salmonella bacteria, despite what bartenders tell you. So, for approaching 20 years I have made the eggnog and other recipes using Davidson’s pasteurized eggs, now called “Safety Eggs.” I have not used these eggs exclusively because I find the texture of the whites is not quite right. When you use them in some recipes they do not rise properly without extra whisking, though I have managed, with the help of my Kitchen Aid Mixmaster, to beat dozens of the egg whites to soft peaks year after year.
From Davidson's web site


Now, however, researchers at the Princeton Plasma Physics Laboratory (PPPL) and the U.S. Department of Agriculture (USDA) have developed a new technique and device for rapidly pasteurizing eggs in the shell, reportedly without changing the texture of the egg white. The new method uses radio frequency (RF) energy to transmit heat through the shell and into the yolk while the egg rotates. While the yolk is heating, cool water flows over the rotating egg to protect the white, which is more sensitive to heat than the yolk. The RF energy creates an electric current that produces heat inside the egg. The egg is then bathed in hot water to pasteurize the white and finish pasteurizing the yolk.

The team, led by David Geveke of the USDA and PPPL engineer Christopher Brunkhorst, believes it has produced a pasteurized egg that is hardly discernible from a fresh, non-pasteurized egg. The USDA Agricultural Research Service in Wyndmoor, Pa., teamed up with Brunkhorst, an expert in RF heating, to develop the method and device. The prototype can pasteurize shell eggs in about one-third of the time that current methods require. Current methods place the eggs in heated water for about an hour and change the consistency of the egg white. The RF process reportedly maintains the egg white's transparency and texture. The USDA has applied to patent the prototype design, which delivers RF energy through the shell by placing electrodes against opposite sides of the egg. The egg rests on rollers that turn it to distribute the cooling and heating water evenly.
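Pasteurization targets are usually expressed as log reductions: each "D-value" of holding time at a given temperature kills 90% of the Salmonella remaining, and the D-value shrinks as the temperature rises. The sketch below shows that standard thermal-death-time arithmetic with hypothetical D- and z-values; the article does not give the actual values for shell eggs or the USDA/PPPL process parameters, so nothing here should be read as the real recipe.

```python
def log_reduction(hold_minutes, d_value_minutes):
    """Each D-value of holding time kills 90% of the remaining bacteria (one log)."""
    return hold_minutes / d_value_minutes

def d_value_at(temp_c, d_ref_minutes, temp_ref_c, z_value_c):
    """Classic thermal-death-time model: D shrinks tenfold for every z degrees above the reference."""
    return d_ref_minutes * 10 ** ((temp_ref_c - temp_c) / z_value_c)

# Hypothetical parameters, for illustration only (not measured values for shell eggs):
D_REF, T_REF, Z = 5.0, 57.0, 5.0   # D = 5 min at 57 degC, z = 5 degC

for temp_c, hold_min in [(57.0, 25.0), (60.0, 10.0), (62.0, 5.0)]:
    d = d_value_at(temp_c, D_REF, T_REF, Z)
    print(f"{temp_c:.0f} degC for {hold_min:.0f} min -> about {log_reduction(hold_min, d):.1f} log reduction")
```

A 5-log reduction, for example, means 99.999% of the Salmonella present are killed; the engineering problem for shell eggs is hitting that sort of target in the yolk while keeping the white below the temperature at which it starts to cook.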

Egg safety affects all of us. Remember, salmonella is not just a disease of commercial chickens. It's common these days for chickens, ducks, and other poultry to carry Salmonella. Live poultry kept as pets or to provide fresh eggs may harbor the salmonella bacteria. Salmonella bacteria live in the intestines of poultry and many other animals. Even organically fed poultry raised in a home setting can have Salmonella, because they were born with it. Live poultry can shed Salmonella bacteria in their droppings and from their bodies (feathers, feet, and beaks) even when they appear healthy and clean. The germs can also get on cages, coops, feed and water dishes, hay, plants, and soil in the area where the birds live and roam and can be picked up by children or adults who work or play where they live and roam. 
from CDC web site
Recently, a national outbreak of salmonella has been linked to an eastern New Mexico hatchery that sells live baby chickens, ducks and other poultry by mail and direct supply. New Mexico’s Department of Health said a strain of salmonella that's infected more than 300 people in 37 states was found in a duck pen at Privett Hatchery in Portales. According to the New Mexico Department of Health news release, salmonella infection is especially risky when parents keep the baby birds inside the house and allow their small children to handle and snuggle with them. Other cases can occur when parents don’t wash their hands properly after handling the birds, indirectly giving the infection to their children. Remember, these are backyard poultry.