- Use of generalized additive modelling techniques to create synthetic daily temperature networks for benchmarking homogenization algorithms
  Authors: Killick R; Jolliffe I; Willett K
  Abstract:
    Background: The removal of non-climatic artefacts (inhomogeneities) from observed station data for the purpose of climate research is an ongoing task. Progress on homogenization algorithms is limited by a lack of suitable test data that are sufficiently realistic and where the ‘truth’ is known a priori.
    Objectives: This article describes a new method to create realistic, synthetic, daily temperature data where the truth is known completely, allowing them to be used as a benchmark against which to test the performance of homogenization algorithms.
    Methods: Observations, reanalysis data and statistical modelling, specifically gamma generalized additive models (GAMs), were combined to produce clean, synthetic, daily mean temperature time series. The clean data were then corrupted with realistic inhomogeneities using both constant offsets and time-varying offsets produced by perturbing some of the GAM’s input variables. When assessing the created clean and corrupted data, particular focus was given to variability, inter-station correlations, temporal autocorrelations and extreme values.
    Results: This is the first daily benchmarking study at this scale, an improvement over monthly or annual studies since it is at the daily level that the extremes of climate are felt. The created inhomogeneities mimic real-world features, as identified from real-world data by the homogenization community; they take the form of both steps and trends and explore changes in the mean and variance. Clean and corrupted data are created for four regions of the USA: Wyoming, the North East, the South East and the South West. These regions encompass a diverse range of climates, from a snow climate in the North East to desert climates in the South West. Four test scenarios were created to allow the assessment of algorithm ability for different inhomogeneity and data structures. Scenario 1 was a best guess for the real world; the other three scenarios were constructed so that the effects of station density, step versus trend inhomogeneities and varying temporal autocorrelation could be investigated. The closeness to reality of the created clean and corrupted data was assessed by comparison with the observed data, noting that the observed data contain some level of both systematic and random error and are therefore not perfect themselves. Generally, the created clean data had higher inter-station correlations in deseasonalized series (~0.9 versus ~0.7) and lower temporal autocorrelations in deseasonalized difference series (~0.01 versus ~0.10) than their real-world counterparts. The addition of inhomogeneities to create the corrupted data resulted in higher temporal autocorrelations in the deseasonalized difference series (~0.20 versus ~0.10) than those seen in the observed data. Despite these differences, the created levels of correlation are able to address issues of signal-to-noise ratio for detection algorithms, as in real-world data.
    Conclusions: These created clean and corrupted data provide a valuable first daily surface temperature data set for homogenization benchmarking studies. It is anticipated that they will serve as a baseline to be built upon in the future.
  PubDate: Tue, 25 Jun 2019 00:00:00 GMT
  DOI: 10.1093/climsys/dzz001
  Issue No: Vol. 3, No. 1 (2019)
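The gamma-GAM construction described above lends itself to a compact illustration. The Python sketch below is not the authors' code: the third-party `pygam` package, the toy station and reanalysis series, the +40 °C shift used to keep the gamma response positive, and the 1.2 °C step offset are all illustrative assumptions. It fits a gamma GAM with smooth terms in day-of-year and a reanalysis predictor, samples a ‘clean’ synthetic series from the fitted distribution, and then corrupts it with a known step inhomogeneity so the benchmark ‘truth’ is available by construction.

```python
# Hedged sketch of the gamma-GAM idea: fit, simulate a clean series, add a known break.
import numpy as np
from pygam import GammaGAM, s  # third-party package, used here for illustration only

rng = np.random.default_rng(0)

# Toy stand-ins for station observations and a co-located reanalysis predictor.
n_days = 3 * 365
doy = np.tile(np.arange(1, 366), 3)[:n_days]                       # day of year
reanalysis_t = 15 + 10 * np.sin(2 * np.pi * doy / 365) + rng.normal(0, 2, n_days)
station_t = reanalysis_t + rng.normal(0, 1.5, n_days)               # observed daily mean (deg C)

shift = 40.0                                   # shift so the gamma response is strictly positive
y = station_t + shift
X = np.column_stack([doy, reanalysis_t])

# Gamma GAM: smooth seasonal term plus a smooth term in the reanalysis predictor.
gam = GammaGAM(s(0) + s(1)).fit(X, y)
mu = gam.predict(X)

# Method-of-moments dispersion, then sample a synthetic 'clean' series from the fitted gamma.
disp = np.mean((y - mu) ** 2 / mu ** 2)
clean = rng.gamma(1.0 / disp, mu * disp) - shift    # shape, scale chosen to match mean and variance

# Corrupt the clean series with a known step inhomogeneity (the benchmark 'truth').
corrupted = clean.copy()
break_day = 600
corrupted[break_day:] += 1.2                         # constant offset, deg C

print(clean[:5])
print(corrupted[break_day:break_day + 5])
```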
- Theoretical foundations of emergent constraints: relationships between climate sensitivity and global temperature variability in conceptual models
  Authors: Williamson M; Cox P; Nijsse F
  Abstract:
    Background: The emergent constraint approach has received recent interest as a way of using multi-model General Circulation Model (GCM) ensembles to identify relationships between observable variations of climate and future projections of climate change. These relationships, in combination with observations of the real climate system, can be used to infer an emergent constraint on the strength of that future projection in the real system. However, there is as yet no theoretical framework to guide the search for emergent constraints. As a result, there are significant risks that indiscriminate data-mining of the multidimensional outputs from GCMs could lead to spurious correlations and less than robust constraints on future changes. To mitigate this risk, Cox et al. (Emergent constraint on equilibrium climate sensitivity from global temperature variability. Nature 2018a; 553: 319; hereafter CHW18) proposed a theory-motivated emergent constraint, using the one-box Hasselmann model to identify a linear relationship between equilibrium climate sensitivity (ECS) and a metric of global temperature variability, Ψ, involving both temperature standard deviation and autocorrelation. A number of doubts have been raised about this approach, some concerning the application of the one-box model to understanding relationships in complex GCMs, which are known to have more than a single characteristic timescale.
    Objectives: To study whether the linear Ψ–ECS proportionality in CHW18 is an artefact of the one-box model. More precisely, we ask: ‘Does the linear Ψ–ECS relationship also feature in the more complex and realistic two-box and diffusion models?’
    Methods: We solve the two-box and diffusion models to find relationships between ECS and Ψ. These models are forced continually with white noise parameterizing internal variability. The resulting analytical relations are essentially fluctuation–dissipation theorems.
    Results: We show that the linear Ψ–ECS proportionality of the one-box model is not generally true in the two-box and diffusion models. However, the linear proportionality is a very good approximation for parameter ranges applicable to the current state-of-the-art CMIP5 climate models. This is not obvious: because of structural differences between the conceptual models, their predictions also differ. For example, the two-box and diffusion models, unlike the one-box model, can reproduce the long-term transient behaviour of the CMIP5 abrupt4xCO2 and 1pctCO2 simulations. Each of the conceptual models also predicts a different power spectrum, with only the diffusion model’s pink 1/f spectrum being compatible with observations and GCMs. We also show that the theoretically predicted Ψ–ECS relationship exists in the piControl as well as the historical CMIP5 experiments, and that the differing gradients of the proportionality are inversely related to the effective forcing in each experiment.
    Conclusions: We argue that emergent constraints should ideally be derived by such theory-driven hypothesis testing, in part to protect against spurious correlations from blind data-mining but mainly to aid understanding. In this approach, an underlying model is proposed, the model is used to predict a potential emergent relationship between an observable and an unknown future projection, and the hypothesized emergent relationship is tested against an ensemble of GCMs.
  PubDate: Mon, 18 Mar 2019 00:00:00 GMT
  DOI: 10.1093/climsys/dzy006
  Issue No: Vol. 3, No. 1 (2019)
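To make the one-box reasoning concrete, the sketch below simulates the white-noise-forced Hasselmann model and computes the CHW18 variability metric Ψ = σ_T / √(−ln α_1T). It is a minimal illustration with made-up parameter values (heat capacity, forcing noise amplitude, feedback parameters λ), not the paper's analysis; its point is the approximately linear scaling of Ψ with ECS = F_2x/λ that the one-box model predicts.

```python
# Hedged sketch: one-box Hasselmann model under stochastic forcing, and the
# Psi metric of CHW18. Parameter values are illustrative, not tuned to any GCM.
import numpy as np

rng = np.random.default_rng(42)

def simulate_one_box(lam, C=8.0, sigma_f=0.6, dt=1.0 / 12.0, n_years=500):
    """Euler-Maruyama integration of C dT/dt = -lam * T + white-noise forcing."""
    n = int(n_years / dt)
    T = np.zeros(n)
    for i in range(n - 1):
        noise = sigma_f * np.sqrt(dt) * rng.normal()
        T[i + 1] = T[i] + (dt / C) * (-lam * T[i]) + noise / C
    return T

def psi(T):
    """Variability metric: sigma_T / sqrt(-ln alpha_1T), with alpha_1T the lag-1 autocorrelation."""
    sigma = np.std(T)
    alpha1 = np.corrcoef(T[:-1], T[1:])[0, 1]
    return sigma / np.sqrt(-np.log(alpha1))

F_2x = 3.7  # W m-2, forcing for doubled CO2
for lam in [0.6, 0.9, 1.2, 1.8]:        # feedback parameter, W m-2 K-1
    T = simulate_one_box(lam)
    print(f"lambda = {lam:.1f}  ECS = {F_2x / lam:4.2f} K  Psi = {psi(T):.3f}")
```

Because α_1T ≈ 1 − λΔt/C and σ_T² ≈ σ_f²/(2λC) for this discretization, Ψ ≈ σ_f/(λ√(2Δt)), i.e. Ψ is proportional to 1/λ and hence to ECS, which the printed values should roughly reproduce.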
- Spatio-temporal patterns of daily Indian summer monsoon rainfall
  Authors: Mitra A; Apte A; Govindarajan R, et al.
  Abstract: The primary objective of this article is to analyse a set of canonical spatial patterns that approximate the daily rainfall across the Indian region, as identified in the companion article, where we developed a discrete representation of the Indian summer monsoon rainfall using state variables whose spatio-temporal coherence is maintained with a Markov random field prior. In particular, we use these spatio-temporal patterns to study the variation of rainfall during the monsoon season. First, the 10 patterns are divided into three families distinguished by their total rainfall amount and geographic spread. These families are then used to establish ‘active’ and ‘break’ spells of the Indian monsoon at the all-India level. Subsequently, we characterize the behaviour of these patterns in time by estimating probabilities of transition from one pattern to another across days in a season. Patterns tend to be ‘sticky’: the self-transition is the most common. We also identify the most commonly occurring sequences of patterns. This leads to a simple seasonal evolution model for the summer monsoon rainfall. The discrete representation introduced in the companion article also identifies typical temporal rainfall patterns for individual locations, which enables us to determine wet and dry spells at local and regional scales. Lastly, we specify sets of locations that tend to have such spells simultaneously and thus arrive at a new regionalization of the landmass.
  PubDate: Mon, 11 Feb 2019 00:00:00 GMT
  DOI: 10.1093/climsys/dzy010
  Issue No: Vol. 3, No. 1 (2019)
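The pattern-transition analysis reduces to estimating an empirical Markov transition matrix from the daily sequence of pattern labels. The short sketch below uses a synthetic random label sequence as a stand-in for the Markov-random-field output (10 patterns, one 122-day monsoon season, both assumptions for illustration) and shows one way to compute row-normalized transition probabilities and read off the ‘sticky’ self-transitions.

```python
# Illustrative sketch: empirical transition probabilities between daily rainfall patterns.
import numpy as np

rng = np.random.default_rng(7)
n_patterns = 10

# Toy daily label sequence for one monsoon season (stand-in for the MRF-derived labels).
labels = rng.integers(0, n_patterns, size=122)

def transition_matrix(labels, k):
    """Row-normalized matrix P[i, j] = Pr(pattern j tomorrow | pattern i today)."""
    counts = np.zeros((k, k))
    for today, tomorrow in zip(labels[:-1], labels[1:]):
        counts[today, tomorrow] += 1
    row_sums = counts.sum(axis=1, keepdims=True)
    return np.divide(counts, row_sums, out=np.zeros_like(counts), where=row_sums > 0)

P = transition_matrix(labels, n_patterns)
print("self-transition probabilities:", np.round(np.diag(P), 2))
print("most likely successor of each pattern:", P.argmax(axis=1))
```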
- Is the glacial climate scale invariant?
  Authors: Mitsui T; Lenoir G; Crucifix M
  Abstract:
    Background: Previous estimates of the power spectrum and of the detrended-fluctuation-analysis scaling exponent of palaeoclimate time series suggested that climate fluctuations are scale invariant over a wide range of time scales. Specifically, the last glacial period is characterised by Dansgaard-Oeschger events, with rapid and frequent transitions between stadial and interstadial regimes, and it was suggested that climate changes then exhibited multi-fractal dynamics.
    Objectives: The present contribution clarifies the interpretation of detrended fluctuation analysis and of power spectra during the last glacial period.
    Methods: We use two simple models exhibiting regime behaviour reminiscent of Dansgaard-Oeschger dynamics: a random telegraph process with additive white Gaussian noise, and the Stommel-Cessi box model of the thermohaline circulation (an analysis of the Lorenz model is also given in the appendix). This analysis then provides support for interpreting the generalized Hurst exponent h(q) and power-spectrum estimates of two NGRIP (Greenland) ice core records: the oxygen isotope ratio and the calcium ion concentration. We also analyse simulations with a stochastic dynamical system model with ice and astronomical forcings, which we recently proposed to simulate Dansgaard-Oeschger events.
    Results: Multifractal detrended fluctuation analyses of time series generated by the toy models reveal that their generalized fluctuation functions have a local scaling regime. The generalized Hurst exponent h(q) is lower for q ≪ 0 than for q > 0. This dependence of h(q) is named here ‘apparent multifractality’. It occurs because the autocorrelation behaviour of small fluctuations (within a regime) differs from that of large fluctuations (regime shifts). The generalized Hurst exponents of the NGRIP records exhibit a form of apparent multifractality similar to that found in the toy models. The stochastic dynamical system model also captures both the power spectrum of the observations and the behaviour of h(q).
    Conclusions: The apparent multifractality of the Dansgaard-Oeschger event records is a consequence of regime switching between stadial and interstadial climates. Neither the local scaling in the power spectrum nor the output of the multifractal detrended fluctuation analysis implies that the underlying process is scale invariant.
  PubDate: Sat, 05 Jan 2019 00:00:00 GMT
  DOI: 10.1093/climsys/dzy011
  Issue No: Vol. 3, No. 1 (2019)
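To make the ‘apparent multifractality’ diagnosis concrete, the following sketch implements a basic multifractal detrended fluctuation analysis and applies it to a toy regime-switching series (a crude random-telegraph-plus-Gaussian-noise stand-in, not the NGRIP data or the paper's models). The estimated generalized Hurst exponents h(q) can then be compared across negative and positive q.

```python
# Hedged sketch: basic MFDFA on a toy regime-switching series.
import numpy as np

def mfdfa(x, scales, qs, order=1):
    """Multifractal DFA: profile -> segment-wise polynomial detrending ->
    q-th order fluctuation functions F_q(s) -> generalized Hurst exponents h(q)."""
    profile = np.cumsum(x - np.mean(x))
    n = len(profile)
    Fq = np.zeros((len(qs), len(scales)))
    for j, s in enumerate(scales):
        n_seg = n // s
        f2 = []
        for v in range(n_seg):
            seg = profile[v * s:(v + 1) * s]
            t = np.arange(s)
            coeffs = np.polyfit(t, seg, order)
            f2.append(np.mean((seg - np.polyval(coeffs, t)) ** 2))
        f2 = np.asarray(f2)
        for i, q in enumerate(qs):
            if abs(q) < 1e-9:                       # q = 0 limit uses the log-average
                Fq[i, j] = np.exp(0.5 * np.mean(np.log(f2)))
            else:
                Fq[i, j] = np.mean(f2 ** (q / 2.0)) ** (1.0 / q)
    # h(q): slope of log F_q(s) against log s
    return np.array([np.polyfit(np.log(scales), np.log(Fq[i]), 1)[0]
                     for i in range(len(qs))])

# Toy regime-switching series: random telegraph process plus white Gaussian noise.
rng = np.random.default_rng(1)
n = 2 ** 15
switch = rng.random(n) < 0.002                      # rare regime shifts
state = np.cumsum(switch) % 2                       # stadial/interstadial-like 0/1 states
x = 3.0 * state + rng.normal(0, 1, n)

scales = np.unique(np.logspace(np.log10(16), np.log10(n // 8), 20).astype(int))
qs = np.array([-5, -2, -1, 1, 2, 5], dtype=float)
for q, hq in zip(qs, mfdfa(x, scales, qs)):
    print(f"q = {q:+.0f}  h(q) = {hq:.2f}")
```

Comparing the printed h(q) values for negative and positive q illustrates the q-dependence that the abstract attributes to regime switching rather than to genuine scale invariance.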