Water pollution in the United States is a growing problem that became critical in the 19th century with the development of mechanized agriculture, mining, and industry, although laws and regulations introduced in the late 20th century have improved water quality in many water bodies.[1] Extensive industrialization and rapid urban growth exacerbated water pollution as a lack of regulation allowed for discharges of sewage, toxic chemicals, nutrients and other pollutants into surface water.[2][3]
In the early 20th century, communities began to install drinking water treatment systems, but control of the principal pollution sources—domestic sewage, industry, and agriculture—was not effectively addressed in the US until the later 20th century. These pollution sources can affect both groundwater and surface water. Multiple pollution incidents such as the Kingston Fossil Plant coal fly ash slurry spill (2008) and the Deepwater Horizon oil spill (2010) have left lasting impacts on water quality, ecosystems, and public health in the United States.[4][5] The United States Geological Survey reported that at least 45% of drinking water in the United States contains per- and polyfluoroalkyl substances (PFAS), commonly referred to as "forever chemicals."[6][7]
Many measures can be implemented to curtail water pollution in the United States, including municipal wastewater treatment, agricultural and industrial wastewater treatment, erosion and sediment control, and the control of urban runoff. Continued implementation of pollution prevention, control, and treatment measures serves the goal of maintaining water quality within levels specified in federal and state regulations. However, many water bodies across the country continue to violate water quality standards in the 21st century.[8]