Thursday, July 1, 2010

The Evolution of Water Treatment

Throughout history, civilizations have risen where there is a reliable supply of drinking water and have failed or disappeared when the population exceeded the available supply. The change could have come through population growth, a shifting climate, or exhaustive water mining, but in every case the water supply became unreliable or vanished and the civilization disappeared. Most early water supplies were drawn from surface water that was often visibly cloudy. The Nile has been fed for millennia by the rich runoff from Ethiopia; the rains carried nutrients and minerals that created the agricultural bounty of Egypt in wet years. One of the earliest water treatments was used by the Egyptians, who added alum to drinking water to cause suspended particles to settle out. The Greeks used charcoal, sunlight, and straining to improve the taste and appearance of their water. The earliest water treatments, then, were methods to clarify water and improve its taste and appearance.
During the 19th century, with the rise of the European city, filtration began to be used regularly. The true advances in water treatment, however, came from advances in scientific understanding. John Snow was a brilliant English physician who, during his short life, was among the first to demonstrate statistically the correlation between water quality and cholera cases in London. This took place before the development of the germ theory of disease. His work became the basis for further exploration by Louis Pasteur, who disproved spontaneous generation, and Robert Koch, who finally proved that germs were the cause of disease.
Filtration and additives like alum are effective treatments for cloudy water, or turbidity, but they have limited success in removing the pathogens that cause diseases like typhoid, cholera, and dysentery. The discovery in the early 1900s that chlorine and ozone were effective disinfectants for eliminating pathogens from drinking water marked the beginning of the modern scientific era of water treatment. The first standards for bacteria in drinking water in the United States (1914) applied only to water carried on interstate boats and trains. The Public Health Service began expanding water standards in 1925 with the most rudimentary requirements; these were broadened in 1946 and again in 1962, when standards were set for 28 substances in drinking water. All fifty states adopted some version of the Public Health Service standards of 1962. However, a landmark Public Health Service survey in 1969 found that only 60% of the water systems surveyed met all 28 standards. Several more studies ensued, resulting in Congress passing the Safe Drinking Water Act (SDWA) of 1974. The SDWA was further amended in 1986 and 1996. Today almost 90 substances are tested for and controlled under the SDWA.
Since the passage of the SDWA in 1974, the treatment of drinking water has increased, most notably among small and medium community water systems; according to the EPA, treatment by these smaller systems has more than doubled. Many of the treatment techniques used by drinking water plants today have been in use for hundreds of years. However, driven by the discovery in drinking water of chlorine-resistant pathogens that can cause hepatitis, gastroenteritis, cryptosporidiosis, and other illnesses, newer technologies are being employed to keep water systems safe. Reverse osmosis and activated carbon have increased in use, and additional methods of purification will no doubt be developed as new chemicals and substances find their way into water supplies through runoff, industrial and waste treatment point-source discharges, and water recycling. As population centers strain their water supplies, we are entering the next age of drinking water in America.
