By Glen M. MacDonald
October 4, 2012
Los Angeles Times Op-Ed
The United States experienced the warmest July in its history, with more than 3,000 heat records broken across the country. Overall, the summer was the nation’s third warmest on record and comes in a year that is turning out to be the hottest ever. High temperatures along with low precipitation generated drought conditions across 60% of the Lower 48 states, which affected 70% of the corn and soybean crop and rendered part of the Mississippi River nonnavigable. Arctic sea ice declined to a record low, and a surface thaw swept across 97% of the Greenland ice cap.
Though it’s not possible to definitively link any of these individual events to human-caused climate change, the summer’s extreme weather follows clear longer-term trends and is consistent with climate model projections. This was the 36th consecutive July and 329th consecutive month in which global temperatures have been above the 20th century average. In addition, seven of the 10 hottest summers recorded in the United States have occurred since 2000. Such rising temperatures and climate anomalies have been documented around the world.
But there’s also one bit of good news: The increasingly powerful evidence of a long-term warming trend is making climate-change denial more difficult to defend. Take “Climategate” — the argument that scientists have based their evidence for global warming on fraudulent science. The Koch Foundation provided funding to physicist Richard Muller of UC Berkeley, a longtime climate-change skeptic, to disprove the widespread consensus on global warming. Instead, his re-analysis showed the exact same warming trend found by the Intergovernmental Panel on Climate Change and other scientists.
Since completing his research last year, Muller has been vociferously speaking out on the reality of human-caused climate change, including in testimony before Congress. The publication this spring of an expanded weather station analysis by Britain’s Hadley Centre further confirms the trend and suggests Northern Hemisphere surface warming was about 0.1 degree Celsius greater than previously thought. With Muller’s and the Hadley Centre’s re-analysis, the idea of Climategate has become virtually impossible to take seriously. The planet is warming.
But that hasn’t silenced the climate-change deniers entirely; they’ve simply shifted their arguments. Increasingly, they are accepting evidence of recent warming, but they deny that it is largely caused by humans, attributing it instead to natural factors such as solar variability or the El Niño system. But these arguments don’t fly any better than their original ones.
Research by Grant Foster of the United States and Stefan Rahmstorf of Germany has shown that recent variations in the solar cycle, volcanic activity and El Niño/La Niña events actually had a tempering effect on warming. Similarly, Markus Huber and Reto Knutti of the Institute for Atmospheric and Climate Science in Zurich found by using simulation models that non-greenhouse gas factors could have accounted for only about 1% of the warming experienced since 1950. And this summer a team headed by Peter Gleckler of Lawrence Livermore National Laboratory provided strong evidence that the recent warming of the ocean surface could be traced to human activities. The evidence is now overwhelming that by and large the warming we are seeing has an anthropogenic cause.
Another common theme of the skeptics recently is that even if anthropogenic climate change is real, projections overstate future warming. Writing in August in the Wall Street Journal, physicists Roger Cohen (a retired ExxonMobil executive), William Happer of Princeton and Richard Lindzen of MIT — all noted climate skeptics — asserted that greenhouse gases, though possibly having a warming effect, were “unlikely to increase global temperature more than about one degree Celsius.”
That 1-degree Celsius, or 1.8-degree Fahrenheit, projection is based largely on a 2011 paper by Lindzen and is contradicted by much other research. The Coupled Model Intercomparison Project Phase 5, for example, which represents about 20 climate modeling groups, generated more than 200 submissions and peer-reviewed publications in 2012 testing and analyzing the newest climate models. The sum result of these improved models reaffirms the Intergovernmental Panel on Climate Change’s 2007 projections of an increase in global temperature of 4 to 8 degrees Fahrenheit by 2100. It is important for scientists to further refine such projections, but it’s clear that increasing greenhouse gases are likely to cause a significant rise in global temperatures.
Speaking recently on MSNBC, Sen. James M. Inhofe (R-Okla.) underscored what has fueled much of the skepticism aimed at climate science: “I thought it must be true,” he said, “until I found out what it cost.” It’s true that mitigation and adaptation will be costly. But inaction could carry even higher costs. Economists Frank Ackerman and Elizabeth Stanton calculated that putting off adaptation and mitigation efforts could cost the United States 1.36% of its gross domestic product by 2025, and 1.84% by 2100.
The question is no longer whether the climate will change because of increased greenhouse gases. Now we have to ask what we can do about it, and how much we can afford to spend. It’s crucial for scientists like me to provide dispassionate estimates of what the climate is doing now and will do in the future. But in the end, we won’t be the ones making the decisions about how best to deal with the warming and its consequences. This will require a broad public conversation and a well-informed public.
Glen M. MacDonald chairs UCLA’s Institute of the Environment and Sustainability and is a professor of geography and of ecology and evolutionary biology.