Head in a Cloud

troposphere and stratosphere meet blogosphere


Year 2100: Yup, Even Hotter!

September 27th, 2007 by jason english · 3 Comments

Editor’s note:  We are proud to begin presenting summaries of the presentations made at the weekly ATOC Journal Club.  Below is the first posting, by Jason English, based on a talk he gave yesterday.  We hope these postings will serve as a useful forum for post-Journal Club discussions.  Enjoy!  -Sean

In July, Chen et al. published an article in JGR on the forcing of anthropogenic aerosols, tropospheric ozone, and long-lived greenhouse gases (GHG) in the year 2100 compared to the year 2000. Here is a link to a talk on this paper. The authors used IPCC emissions scenario A2, a “worst case” scenario from an emissions standpoint, with the highest CO2 and SO2 emissions of any IPCC scenario. They compared year-2000 forcing to that of year 2100, and the topline conclusion could be described as “and you thought things were bad now!” The top-of-atmosphere (TOA) radiative forcing (RF) differences between 2000 and 2100 for anthropogenic aerosols, tropospheric ozone, and GHG were +0.18 (direct effects only), +0.65, and +6.54 W m-2, respectively. Surface forcings were -3.02, +0.02, and +1.37 W m-2, respectively. These induced equilibrium global annual mean surface temperature changes of +0.14, +0.32, and +5.31 K, respectively. The positive TOA RF for aerosols was due to a reduction in scattering aerosols (sulfates) and an increase in absorbing aerosols (mainly black carbon); however, all aerosols cause a strong cooling at the surface. Ozone and GHG concentrations, whose climate effect comes mainly from absorbing longwave radiation, are expected to continue to rise, causing a positive TOA and surface forcing.
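As a quick sanity check on these numbers (my own back-of-the-envelope arithmetic, not a calculation from the paper), the equilibrium surface temperature response is often related to the TOA forcing through a climate sensitivity parameter, dT ≈ λ·dF. Dividing the quoted temperature changes by the quoted TOA forcings gives the implied λ for each forcing agent:

    # Back-of-the-envelope check (not from Chen et al.): divide the quoted
    # equilibrium temperature change by the quoted TOA radiative forcing to
    # get an implied climate sensitivity parameter lambda, in K per W m-2.
    toa_forcing = {                    # year 2100 minus year 2000, W m-2
        "anthropogenic aerosols (direct)": 0.18,
        "tropospheric ozone": 0.65,
        "long-lived GHG": 6.54,
    }
    delta_temperature = {              # equilibrium global annual mean, K
        "anthropogenic aerosols (direct)": 0.14,
        "tropospheric ozone": 0.32,
        "long-lived GHG": 5.31,
    }

    for agent, rf in toa_forcing.items():
        lam = delta_temperature[agent] / rf
        print(f"{agent:32s} RF = {rf:+5.2f} W m-2, "
              f"dT = {delta_temperature[agent]:+5.2f} K, "
              f"implied lambda = {lam:4.2f} K per W m-2")

The implied values differ between agents (roughly 0.5 to 0.8 K per W m-2 here), and the aerosol case in particular is hard to read from the TOA number alone, since the surface forcing (-3.02 W m-2) has the opposite sign of the TOA forcing.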

The authors also found that aerosols and GHG can induce changes in the global circulation, affecting regional temperature and precipitation patterns. The aerosol RF strengthened the Hadley Cell in DJF and weakened it in JJA, while GHG weakened the Hadley Cell year-round except in the ascending branch in the convection zone. In addition, the strong aerosol RF over Asia caused substantial surface cooling, the formation of a persistent high pressure system over the continent, and reduced evaporation and precipitation.

These results are consistent with the majority of climate models in operation today (aerosols cause localized cooling, GHG cause global warming), and we can expect global surface temperatures to continue to rise. In particular, these results suggest that under the A2 emissions scenario, surface warming will become even more pronounced.

If these projections are accurate, we have some major consequences ahead of us: heat waves, droughts, floods, rising sea levels, ecosystem habitat loss, invasive species, disease outbreaks, and so on. The struggle I have with all this is that there remain some uncertainties in climate models, such as non-linear feedbacks, resolution limitations, and convection parameterization, which contribute to model uncertainties of an unknown magnitude. These uncertainties are easily overlooked, since the standard deviations between model runs do not take them into account. Not to mention that all models have been tuned to fit the past 100 years of climate, with no guarantee that these parameterizations can extrapolate far into the future. On the other hand, we do have an enormous quantity of observational and modeling data supporting the conclusions that the earth has been warming over the past 50 years, that this is likely due to humans, and that the earth should continue to warm, with potentially disastrous consequences for the future survival of our species. So, if we wait until we are absolutely certain that the climate projections are correct, will it be too late to change our emissions (if it’s not too late already)?

Tags: aerosols · ATOC Journal Club · climate · global warming · modeling · troposphere

3 responses so far ↓

  • 1 Sean Davis // Sep 27, 2007 at 3:02 pm

    I’ve routinely heard as a contrarian talking point that climate models are “tuned to fit the past 100 years of climate” (not that I’m calling you a contrarian — on the contrary, I recognize that you’re not).

    I’m no climate modeler, but I’ve yet to see any evidence that climate models are somehow tuned so that they fit the last 100 years. I can buy that climate models use “real-world” data as input or to generate parameterizations (which is a good thing, btw!), but I’ve never seen any evidence that they are somehow magically tuned to, for example, fit the temperature record over the last century.

    Perhaps you could enlighten us on specifically how climate models are tuned? This is a very interesting issue that gets tossed around frequently as a way to essentially write off any projections of 21st century climate by models. My impression is that it’s a simplification at best, and disingenuous at worst, to claim that GCMs are simply tuned to match the climate record.

    This would be a good thread in and of itself … perhaps for someone else’s journal club presentation! ;)

    PS. One good link is the response by Gavin Schmidt (a real climate modeler) over at RealClimate

  • 2 jason english // Sep 27, 2007 at 4:55 pm

    I agree that this could be a whole discussion by itself! My understanding is that some processes (such as convection) are so complex that there don’t yet exist observations or theories to sufficiently constrain their parameterizations. So this is what happens: a climate modeler creates a parameterization scheme, runs a simulation, and hopes that it matches some expected events from the 20th century (such as ENSO cycles). This tuning continues until a parameterization scheme more closely matches the “expected results” (e.g. our understanding of past climate); a toy sketch of that loop is at the end of this comment. I read your link to Gavin’s comment about parameterizations, and it doesn’t appear to me that he is refuting any of this, since he mentions the lack of constraint in some data, which gives the modelers flexibility to modify the parameterizations.

    One reassuring thing about climate models, which Gavin alludes to and the IPCC AR4 report bears out, is that all climate models project an increase in global surface temperatures, including simulations that use extremes of those parameterizations which are not yet well constrained.
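    To make that loop a bit more concrete, here is a purely illustrative sketch; the parameter name, the candidate values, and both helper functions are hypothetical stand-ins, not code from any actual GCM:

        # Illustrative only: a caricature of the tuning loop described above.
        # "entrainment_rate", both helpers, and the target value 0.12 are
        # hypothetical placeholders, not real GCM code or data.

        def run_simulation(entrainment_rate):
            # Stand-in for a 20th-century hindcast with one convection parameter.
            return {"entrainment_rate": entrainment_rate}

        def misfit_to_observations(model_output):
            # Stand-in for comparing hindcast metrics (ENSO statistics, global
            # mean temperature, ...) to the observed record; lower is better.
            return abs(model_output["entrainment_rate"] - 0.12)

        candidate_values = [0.05, 0.10, 0.15, 0.20]   # hypothetical settings
        best_value, best_misfit = None, float("inf")
        for value in candidate_values:
            output = run_simulation(value)
            misfit = misfit_to_observations(output)
            if misfit < best_misfit:
                best_value, best_misfit = value, misfit

        print("retained parameter value:", best_value)

    The point is only that the scheme retained is the one whose hindcast looks most like the observed record, which is the flexibility Gavin describes.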

  • 3 Roger A. Pielke Sr. // Oct 3, 2007 at 8:13 am

    Two and a half years ago I launched a weblog to discuss issues such as Jason has raised in his weblog. I offer two postings here that discuss what the multi-decadal global climate models are capable of, and how their results should be communicated:

    http://climatesci.colorado.edu/2005/07/15/what-are-climate-models-what-do-they-do/

    http://climatesci.colorado.edu/index.php?s=hypotheses&submit=Search

    There are other examples on Climate Science.

    The bottom line is that we cannot be confident how the climate will evolve over the next century, as the models do not have all of the important climate forcings, much less all of the feedbacks. All that we can confidently state is that humans are altering the climate in diverse ways.

    With respect to “tuning” of models using past weather, this is not done explicitly. However, since we have seen how the climate has changed over the last century in terms of selected climate metrics, such as the globally averaged surface temperature, the forcings (such as from aerosols) can be adjusted to closely match such metrics.

    However, the true (blind) test of the models is how well they predict the evolution of important climate metrics over the coming decade. In terms of “global warming”, for example, the appropriate metric is ocean heat content change; see http://climatesci.colorado.edu/2007/04/04/a-litmus-test-for-global-warming-a-much-overdue-requirement/
