Since the beginning of the Industrial Revolution, humans have released ~500 billion metric tons of carbon into the atmosphere through fossil-fuel burning, cement production and land-use changes. About 30% of this carbon has been taken up by the oceans. The oceanic uptake of carbon dioxide alters marine carbonate chemistry, lowering seawater pH and carbonate ion concentration, a process commonly referred to as ocean acidification. Ocean acidification is considered a major threat to calcifying organisms. Detecting its magnitude and impacts on regional scales requires accurate knowledge of the level of natural variability of surface ocean carbonate ion concentrations on seasonal to annual timescales and beyond. Ocean observations are severely limited in their ability to provide reliable estimates of the signal-to-noise ratio of human-induced trends in carbonate chemistry relative to natural variability. Using three Earth system models, we show that the current anthropogenic trend in ocean acidification already exceeds the level of natural variability by up to 30 times on regional scales. Furthermore, we demonstrate that the current rates of ocean acidification at monitoring sites in the Atlantic and Pacific oceans exceed those experienced during the last glacial termination by two orders of magnitude.
Detecting regional anthropogenic trends in ocean acidification against natural variability
2012
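The central quantity in the abstract is a signal-to-noise ratio: the size of the human-induced change in carbonate chemistry relative to the spread of natural variability. The Python sketch below illustrates one common way such a ratio can be computed from an unforced (control) time series and a forced time series; it is not the authors' code, and every number in it (pH levels, trend, noise amplitude) is a hypothetical placeholder rather than a value from the paper or its models.

```python
# Minimal sketch: detecting an anthropogenic trend against natural variability
# via a signal-to-noise ratio. All numeric values are illustrative placeholders.
import numpy as np

rng = np.random.default_rng(0)

# --- "Natural variability": synthetic preindustrial control series ---
# e.g. annual-mean surface pH at a single grid cell or monitoring site.
n_years = 200
natural_ph = 8.17 + 0.01 * rng.standard_normal(n_years)   # hypothetical noise level
sigma_natural = natural_ph.std(ddof=1)                      # 1-sigma natural variability

# --- "Anthropogenic signal": synthetic forced series with a pH trend ---
years = np.arange(1960, 2011)
trend_per_year = -0.0018                                     # hypothetical trend (pH units / yr)
forced_ph = (8.11 + trend_per_year * (years - years[0])
             + 0.01 * rng.standard_normal(years.size))

# Least-squares estimate of the forced trend.
slope, intercept = np.polyfit(years, forced_ph, 1)

# Signal-to-noise ratio: total trend-induced change over the record
# divided by the standard deviation of natural variability.
signal = abs(slope) * (years[-1] - years[0])
snr = signal / sigma_natural

print(f"estimated trend : {slope:+.4f} pH units / yr")
print(f"natural 1-sigma : {sigma_natural:.4f} pH units")
print(f"signal-to-noise : {snr:.1f}")
```

With these placeholder settings the forced change over five decades is several times the 1-sigma natural spread; the paper's "up to 30 times" result refers to the analogous ratio diagnosed regionally from three Earth system models, not to this toy calculation.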