Climate models are more formally called General Circulation Models (GCMs). They are global in extent and simulate the climate with a fully interactive atmosphere and ocean. Here, we look in some detail at the type of facility needed to run a GCM. The example chosen is the European Centre for Medium-Range Weather Forecasts (ECMWF) in the UK.
Model experiments are performed on the ECMWF supercomputers, a set of IBM shared-memory multi-processor machines, and the output is stored and disseminated from the ECMWF archiving system, which is based on the MARS (Meteorological Archival and Retrieval System) architecture.
Logistics of running a model
A single run of a GCM cannot produce a definitive climate prediction. Because of uncertainty in the initial conditions, and simplifications of atmospheric physics in the model formulation, each run of the model produces only one possible realization of the climate, and this output will be biased, or in error, in certain regions. Experimentation shows that if many runs are performed using the same forcing but slightly different starting conditions, the average of the output from all the runs reflects the observed climate more accurately than the output from any single run. Using output from a group of model runs also enables scientists to estimate the probability of particular impacts of climate change. A group of model runs performed in this way is known as an ensemble, and each individual run is an ensemble member.
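The idea of averaging over an ensemble can be illustrated with a toy sketch. This is not a climate model: `run_model`, the perturbation sizes, and the "true climate" value below are all invented for illustration. Each toy run shares the same forcing but starts from a slightly perturbed initial state and carries its own bias, standing in for a single GCM integration; the ensemble mean averages these biases out.

```python
import random

random.seed(42)

TRUE_CLIMATE = 15.0  # hypothetical "observed" mean temperature (deg C)

def run_model(initial_perturbation, n_years=100):
    """Toy stand-in for one GCM integration (an ensemble member).

    The run has its own fixed bias (from the perturbed start and
    imperfect physics) plus year-to-year internal variability.
    """
    bias = random.gauss(0.0, 0.5) + initial_perturbation
    return [TRUE_CLIMATE + bias + random.gauss(0.0, 1.0)
            for _ in range(n_years)]

# An ensemble: 20 runs with the same forcing, perturbed initial conditions
ensemble = [run_model(random.uniform(-0.2, 0.2)) for _ in range(20)]

# Compare one member's long-term mean with the ensemble mean
single_mean = sum(ensemble[0]) / len(ensemble[0])
ensemble_mean = sum(sum(run) / len(run) for run in ensemble) / len(ensemble)

print("single-run error:   ", abs(single_mean - TRUE_CLIMATE))
print("ensemble-mean error:", abs(ensemble_mean - TRUE_CLIMATE))
```

Because the per-run biases are independent, averaging many members typically shrinks the overall error, which is the statistical rationale for running ensembles rather than relying on any one integration.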
A typical model experiment might involve simulating the climate over 350 years, and a single ensemble member might take around three months to run. For example, the many modelling groups contributing to the ENSEMBLES project will create data for a total of about 20,000 model years, requiring around 50 TB of storage, equivalent to the disk capacity of about 500 laptops!
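The quoted figures can be checked with simple arithmetic. The per-laptop disk size below is an assumption inferred from the comparison in the text (50 TB across about 500 laptops implies roughly 100 GB each); the sketch just makes the implied rates explicit.

```python
# Back-of-envelope check of the storage figures quoted above.
TOTAL_MODEL_YEARS = 20_000   # total simulated years across all groups
TOTAL_STORAGE_GB = 50 * 1000 # 50 TB expressed in GB (decimal units)
LAPTOP_DISK_GB = 100         # assumed per-laptop capacity

gb_per_model_year = TOTAL_STORAGE_GB / TOTAL_MODEL_YEARS
laptops_needed = TOTAL_STORAGE_GB / LAPTOP_DISK_GB

print(gb_per_model_year)  # 2.5 GB of output per simulated year
print(laptops_needed)     # 500.0 laptops
```

So each simulated model year generates on the order of 2.5 GB of archived output, which is why dedicated archiving systems such as MARS are needed rather than ordinary disk storage.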