"Give me ten parameters, and I'll simulate an elephant for you. Give me one more, and he'll wag his tail." The saying sums up the problem with many models: they allow you to demonstrate anything and everything, as long as there are enough knobs to turn. The real test of a model comes when you compare it to reality.
But when it comes to climate change, researchers are faced with a practically insoluble problem: We won't know for sure until the end of the century whether climate predictions for the year 2100 are correct or not. But with climate scientists around the world warning of the dangerous consequences of climate change, it becomes apparent that we can hardly afford to wait that long.
How do all these equations work together? The same way one would expect from physics class: Temperature goes up when warm air moves around or when rising pressure compresses air. Such changes in temperature in turn affect atmospheric pressure which then drives air masses. The Earth's rotation plays a role as well. This interplay between temperature, pressure and air circulation is reflected in the equations that drive climate prediction models.
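The coupling described above can be illustrated with two textbook relations: the ideal gas law, which ties pressure to density and temperature, and the dry adiabatic relation, by which rising pressure compresses and warms air. The following is a minimal sketch with illustrative numbers, not an excerpt from any climate model:

```python
# Minimal sketch of the temperature-pressure coupling described above,
# using the ideal gas law and dry adiabatic compression. Constants are
# standard values for dry air; the scenario is illustrative.

R_DRY = 287.0    # specific gas constant for dry air, J/(kg K)
CP = 1004.0      # specific heat at constant pressure, J/(kg K)

def pressure(density, temperature_k):
    """Ideal gas law: p = rho * R * T."""
    return density * R_DRY * temperature_k

def compress_adiabatically(t1_k, p1, p2):
    """Rising pressure warms air: T2 = T1 * (p2/p1)**(R/cp)."""
    return t1_k * (p2 / p1) ** (R_DRY / CP)

# Air at 280 K compressed from 900 hPa to 1000 hPa warms by several degrees.
t2 = compress_adiabatically(280.0, 90000.0, 100000.0)
```

Real models solve these relations simultaneously at every grid point, together with the equations of motion for the air itself.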
But all these processes must be dramatically simplified for even supercomputers to produce results in a reasonable amount of time. "You have to limit yourself to the processes you think are most important," says Marco Giorgetta of the Max Planck Institute for Meteorology in Hamburg.
Because there is no way to simulate the path of every single CO2 molecule through the atmosphere, model makers divide the Earth into a coarse, chessboard-like grid. The squares are hundreds of kilometres wide. The result is thousands of imaginary quadrants in the atmosphere, each represented by a single grid point in the climate model. The simulation calculates predicted values for temperature and air movement at each grid point, in time steps of between five and 20 minutes -- until it arrives at the year 2100.
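The grid-and-time-step scheme can be sketched schematically. In this toy version, the "physics" is nothing more than relaxing each cell toward the average of its neighbours -- a stand-in for the real model equations, on a hypothetical grid of roughly 10-degree cells:

```python
# Schematic of the grid-and-time-step scheme: a coarse global grid of
# temperature values advanced in fixed time steps. Grid size, time step
# and initial field are all illustrative assumptions.
import numpy as np

N_LAT, N_LON = 18, 36                    # cells hundreds of km wide
DT_MINUTES = 10                          # a step in the 5-20 minute range
temps = np.full((N_LAT, N_LON), 15.0)    # toy initial field, deg C
temps[:3, :] = -20.0                     # colder polar band

def step(field):
    """One time step: relax each cell toward its neighbours' mean."""
    neighbours = (np.roll(field, 1, 0) + np.roll(field, -1, 0) +
                  np.roll(field, 1, 1) + np.roll(field, -1, 1)) / 4
    return field + 0.1 * (neighbours - field)

# A real run repeats this for every step until the year 2100;
# here we take one day's worth of steps.
for _ in range((24 * 60) // DT_MINUTES):
    temps = step(temps)
```

Even this crude scheme shows the essential mechanics: a fixed grid, a short time step, and a rule updating every point from its neighbours, repeated millions of times.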
Fortunately, climate researchers do have the ability to compare their data with reality. They can program their models, for example, to simulate the climate for the last 100 years and then compare the results with what really happened. Still, the process is not free of problems. First and foremost is the lack of data from a century ago -- there are only a few dozen exact measurements from 1900. For thousands of quadrants above the oceans, deserts and ice caps, there is no data whatsoever.
Moment of Equilibrium
Researchers can either estimate the values for temperature, wind and humidity, or they can use their computer models to establish broad parameters. To do so, they begin a climate simulation using a random value -- for example, 0 degrees Celsius for the entire globe. The simulation is then allowed to run until the computer arrives at a stable average annual temperature for each grid point. This moment of equilibrium is then used as a starting point.
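The spin-up procedure can be sketched in a few lines. Here the whole "model" is collapsed into a single relaxation toward an assumed equilibrium temperature -- both the target value and the relaxation rate are illustrative assumptions, not values from any real simulation:

```python
# Hedged sketch of the spin-up described above: start at an arbitrary
# 0 degrees Celsius and iterate until the temperature stops drifting.
EQUILIBRIUM_C = 14.0   # assumed stable mean temperature, illustrative
RELAXATION = 0.05      # fraction of the gap closed per step, illustrative

def spin_up(start_c=0.0, tolerance=1e-4, max_steps=10000):
    """Iterate until the change per step falls below tolerance."""
    t = start_c
    for step in range(max_steps):
        t_next = t + RELAXATION * (EQUILIBRIUM_C - t)
        if abs(t_next - t) < tolerance:
            return t_next, step   # the moment of equilibrium
        t = t_next
    return t, max_steps

temp, steps = spin_up()   # converges close to EQUILIBRIUM_C
```

The converged state then serves as the starting point for the actual experiment, just as the article describes.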
The process is not free from possible pitfalls. Using a model to establish past climate events that match up well with reality can be something of a self-fulfilling prophecy -- and not necessarily one that creates reliable results for what the future might look like.
Further room for uncertainty is created by additional variables that must be programmed into the model during a process that researchers call parameterization. Clouds provide a good example. "Individual clouds fall through the mesh of grid points, because they are in general much smaller than the distance between two grid points," says Mojib Latif from the Leibniz Institute of Marine Sciences in Kiel.
Given the approximate nature of the grid patterns used by the models, clouds are difficult to model, but they nonetheless play an important role in the global climate. "Clouds influence the transfer of radiation -- how much sunlight is reflected and how much is allowed to pass through," Giorgetta of Max Planck explains. To account for this, each grid point is assigned a value representing the degree of cloudiness for each point in time. It is a process without which climate models would not work -- and which some climate researchers find fault with.
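A parameterization in this spirit can be sketched as a single function: each grid point carries a cloud-fraction value, and that value scales how much sunlight gets through. The function and its coefficients are hypothetical illustrations, not any model's actual cloud scheme:

```python
# Illustrative cloud parameterization: a per-grid-point cloud fraction
# determines how much incoming sunlight is transmitted. The cloud
# albedo of 0.5 is an assumed, illustrative value.

def transmitted_sunlight(incoming_wm2, cloud_fraction, cloud_albedo=0.5):
    """The cloudy fraction of the sky reflects cloud_albedo of sunlight."""
    return incoming_wm2 * (1.0 - cloud_fraction * cloud_albedo)

# Half-overcast sky: 342 W/m^2 incoming, a quarter reflected by clouds.
flux = transmitted_sunlight(342.0, 0.5)
```

The real schemes are far more elaborate, but the principle is the same: a sub-grid process too small to resolve is replaced by a formula evaluated at every grid point.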
As Accurately as Possible
A similar process that must be accounted for artificially is the friction created by air masses as they travel across the Earth's surface -- a process greatly influenced by topography. In creating parameters for their climate models, researchers use measurements whenever they can. "But there are parameterizations that cannot be established using theory or observations," Giorgetta says. These parameters must be programmed as realistically as possible so that they reproduce past climate developments as accurately as possible.
The closer one looks at climate models, the greater the temptation to doubt their usefulness. Is this not a case of altering parameters until they produce the desired results? How much real science can be found in the models? How much is merely the result of tuning?
Still, climate experts believe they have the uncertainties in their simulations under control. "We check, among other things, how sensitive the models are to small changes in the parameterization," says Latif. In other words, the researchers tinker with the various parameters to see whether the model still works just as well. "Too large a change in the simulation results would not be acceptable," Latif emphasizes.
Nevertheless, no serious scientist can guarantee the validity of the results. Climatologists long ago stopped giving concrete values for the predicted temperature in the year 2100. Nowadays they talk in terms of probabilities. For example, under emissions scenario A, it is 80 percent certain that the global average temperature will increase by at least 2 degrees Celsius.
In order to estimate the degree of uncertainty of climate forecasts, results based on different models, parameters and initial conditions are compared with each other. That then provides the range within which the expected value should lie.
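The ensemble comparison boils down to simple statistics over the individual runs. The model names and warming values below are hypothetical placeholders, chosen only to show the mechanics:

```python
# Sketch of the ensemble approach described above: collect projections
# from several models and report the range within which the expected
# value should lie. All values are hypothetical.
projections = {            # assumed 2100 warming per model, deg C
    "model_a": 2.1,
    "model_b": 3.4,
    "model_c": 2.8,
    "model_d": 4.0,
}

values = sorted(projections.values())
low, high = values[0], values[-1]
best_estimate = sum(values) / len(values)
print(f"expected warming: {best_estimate:.1f} C (range {low:.1f}-{high:.1f} C)")
```

Real assessments use dozens of runs and report probability intervals rather than a bare minimum-to-maximum range, but the spread across models is what quantifies the uncertainty.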
But what happens if, in the future, geoscientists discover previously unknown relationships in the Earth's energy balance, or if significantly faster computers mean that more and more atmospheric processes can be modelled, rather than having to resort to simplifications? If those things happen, will the climate simulations suddenly produce different results?
Latif does not think it would make a difference. "Basically the most important processes have already been known for over 100 years." Nothing about that will change in the next 100 years either, he feels.
Stefan Rahmstorf of the Potsdam Institute for Climate Impact Research (PIK) takes a similar view. "The projections for the global mean temperature are very reliable, because they are determined within narrow limits by our planet's energy balance: incoming solar radiation minus the reflected component -- the so-called albedo -- and emitted thermal radiation." That energy balance is simple and well understood, he says.
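The energy balance Rahmstorf describes can be written down in one line: absorbed sunlight, S(1 - albedo)/4, must equal emitted thermal radiation, sigma * T^4. Solving for T gives the Earth's effective radiating temperature -- a standard textbook calculation, sketched here:

```python
# Zero-dimensional energy balance: absorbed solar radiation equals
# emitted thermal radiation, S(1 - a)/4 = sigma * T**4.
SOLAR_CONSTANT = 1361.0   # W/m^2, incoming solar radiation
ALBEDO = 0.30             # reflected fraction of sunlight
SIGMA = 5.670e-8          # Stefan-Boltzmann constant, W/(m^2 K^4)

def effective_temperature(solar=SOLAR_CONSTANT, albedo=ALBEDO):
    """Solve the balance for T: [S(1 - albedo) / (4 sigma)]**0.25."""
    return (solar * (1.0 - albedo) / (4.0 * SIGMA)) ** 0.25

t_eff = effective_temperature()   # roughly 255 K, about -18 deg C
```

The result is well below the observed surface average of about 15 degrees Celsius; the difference is the greenhouse effect, which is why changing the atmosphere's composition shifts the balance.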
© SPIEGEL ONLINE 2009