Authors: Dong Mao, Yang Wang, Qiang Wu Abstract: Advances in Adaptive Data Analysis, Volume 07, Issue 01n02, April 2015. In this paper, we develop a new approach for the analysis of physiological time series. An iterative convolution filter is used to decompose the time series into various components. Statistics of these components are extracted as features to characterize the mechanisms underlying the time series. Studies have shown that many normal physiological systems involve irregularity, while a decrease in irregularity usually implies abnormality. This motivates the use of statistics of “outliers” in the components as features measuring irregularity. Support vector machines are used to select the most relevant features, namely those able to differentiate between time series from normal and abnormal systems. This new approach is successfully applied to the study of congestive heart failure using heartbeat interval time series. Citation: Advances in Adaptive Data Analysis PubDate: 2016-04-07T09:03:28Z DOI: 10.1142/S1793536915500016
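The decompose-then-extract-features pipeline described in the abstract can be sketched as follows. This is a minimal illustration only: it assumes a simple moving-average convolution kernel and a k-sigma outlier rule, and it omits the SVM feature-selection step; the function names and parameter choices are hypothetical, not the paper's.

```python
import numpy as np

def iterative_filter_decompose(x, kernel_width=5, n_components=3):
    """Split x into components by repeatedly subtracting a moving-average
    (convolution) smoother; each pass peels off the fastest remaining
    oscillations, and the final residual is the trend."""
    kernel = np.ones(kernel_width) / kernel_width
    components, residual = [], np.asarray(x, dtype=float)
    for _ in range(n_components):
        smooth = np.convolve(residual, kernel, mode="same")
        components.append(residual - smooth)  # high-frequency part
        residual = smooth                     # keep filtering the smooth part
    components.append(residual)               # trend; all parts sum back to x
    return components

def outlier_fraction(component, k=2.0):
    """Fraction of samples beyond k standard deviations -- a simple
    irregularity feature in the spirit of the 'outlier' statistics."""
    s = component.std()
    if s == 0:
        return 0.0
    return float(np.mean(np.abs(component - component.mean()) > k * s))

rng = np.random.default_rng(0)
x = np.sin(np.linspace(0, 20, 500)) + 0.3 * rng.standard_normal(500)
comps = iterative_filter_decompose(x)
features = [outlier_fraction(c) for c in comps]  # one feature per component
```

In the paper, per-component statistics of this kind are then ranked by a support vector machine so that only the features separating normal from abnormal records are retained.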

Authors: Leonard J. Pietrafesa, Shaowu Bao, Tingzhuang Yan, Michael Slattery, Paul T. Gayes Abstract: Advances in Adaptive Data Analysis, Volume 07, Issue 01n02, April 2015. Significant portions of United States (U.S.) property, commerce and ecosystem assets are located at or near the coast, making them vulnerable to sea level variability and change, especially relative rises. Although global mean sea level (MSL) and sea level rise (SLR) are fundamental considerations, regional mean sea level (RSL) variability along the U.S. boundaries of the two ocean basins is critical, particularly if the amplitudes of seasonal, annual and inter-annual variability are high. Of interest is the conventional wisdom of the U.S. agencies, the National Aeronautics and Space Administration (NASA) and the National Oceanic and Atmospheric Administration (NOAA), which both contend that the sources of sea level rise are related principally to heat absorption and release by the ocean(s) to the atmosphere and vice versa, and to polar glacier melting and freshwater input into the ocean(s). While these phenomena are of great importance to SLR and sea level variability (SLV), we assess a suite of climate factors and the Gulf Stream for evidence of correlations and thus possible influences, though causality is beyond the scope of this study. Climate factors related to oceanic and atmospheric heat purveyors and reservoirs are analyzed and assessed for possible correlations with sea level variability and overall trends on actionable scales (localized as opposed to global).
The results confirm that oceanic and atmospheric temperature variability, and the disposition of heat accumulation or the lack thereof, are important players in sea level variability and rise, but also that the Atlantic Multi-Decadal Oscillation, the El Niño-Southern Oscillation, the Pacific Decadal Oscillation, the Arctic Oscillation, the Quasi-Biennial Oscillation, the North Atlantic Oscillation, Solar Irradiance, the Western Boundary Current-Gulf Stream, and other climate factors can have strong correlative, and perhaps even causal, modulating effects on the monthly to seasonal to annual to inter-annual to decadal to multi-decadal sea level variability at the community level. Citation: Advances in Adaptive Data Analysis PubDate: 2016-04-07T09:03:24Z DOI: 10.1142/S1793536915500053
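A correlation screen of the kind this study performs can be sketched as follows. This is an illustrative lagged Pearson correlation between a sea level anomaly series and a single climate index, run on synthetic monthly data; it is not the authors' data, index set, or analysis method.

```python
import numpy as np

def lagged_correlation(sea_level, index, max_lag=12):
    """Pearson correlation between a sea level series and a climate index
    for lags 0..max_lag months, with the index leading the sea level."""
    corrs = {}
    for lag in range(max_lag + 1):
        a = sea_level[lag:]               # sea level, shifted forward in time
        b = index[:len(index) - lag]      # index values lag months earlier
        corrs[lag] = float(np.corrcoef(a, b)[0, 1])
    return corrs

# Synthetic monthly data: the 'sea level' follows the 'index' 3 months later.
n = 240
t = np.arange(n)
index = np.sin(2 * np.pi * t / 24)        # a 24-month oscillation
rng = np.random.default_rng(1)
sea_level = np.zeros(n)
sea_level[3:] = index[:-3]
sea_level += 0.1 * rng.standard_normal(n) # observational noise

corrs = lagged_correlation(sea_level, index, max_lag=6)
best_lag = max(corrs, key=corrs.get)      # lag with the strongest correlation
```

As the abstract notes, a strong lagged correlation of this kind establishes a possible influence, not causality.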

Authors: Adam Huang, Min-Yin Liu, Wei-Te Yu Abstract: Advances in Adaptive Data Analysis, Volume 07, Issue 01n02, April 2015. We propose using a rolling ball algorithm, which moves a ball along a time series signal, to sort the local extrema within the signal according to a geometric tangibility criterion. Letting the ball roll always above or below the signal, we classify the signal’s extrema by their tangibility: touched or not touched by the ball. Applying this ball-tangibility information to extremum selection in the empirical mode decomposition (EMD) algorithm, we are able to prevent the mode-mixing problem when analyzing intermittent signals and to decompose mode functions satisfying bandpass filtering properties. Citation: Advances in Adaptive Data Analysis PubDate: 2016-04-07T09:03:22Z DOI: 10.1142/S179353691550003X
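One way to realize such a tangibility test is sketched below: a disc of radius r is rested on each local maximum, and the maximum counts as "touched" if the disc clears the signal around it. The function names, the discrete geometry, and the numeric tolerance are illustrative assumptions, not the paper's exact construction.

```python
import math

def local_maxima(x):
    """Indices of the interior local maxima of a sampled signal."""
    return [i for i in range(1, len(x) - 1) if x[i - 1] < x[i] >= x[i + 1]]

def touched_from_above(x, i, r):
    """True if a disc of radius r resting on the maximum at index i clears
    the signal around it, i.e. the extremum is tangible from above."""
    cy = x[i] + r                          # disc center sits r above the peak
    lo = max(0, int(math.floor(i - r)))
    hi = min(len(x) - 1, int(math.ceil(i + r)))
    for j in range(lo, hi + 1):
        dx = j - i
        if dx * dx > r * r:                # sample outside the disc's span
            continue
        boundary = cy - math.sqrt(r * r - dx * dx)  # disc's lower edge at j
        if x[j] > boundary + 1e-9:         # signal pokes into the disc
            return False
    return True

# A small peak right next to a tall spike: a large ball cannot reach it.
x = [0.0, 1.0, 0.0, 5.0, 0.0]
maxima = local_maxima(x)                   # indices 1 and 3
tangible = {i: touched_from_above(x, i, 2.0) for i in maxima}
```

With radius 2 only the tall spike is touched; roughly speaking, restricting envelope construction in EMD to touched extrema is what suppresses mode mixing from intermittent bursts.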

Authors: Stan Lipovetsky, Michael Conklin Abstract: Advances in Adaptive Data Analysis, Volume 07, Issue 01n02, April 2015. Maximum difference (MaxDiff) is a discrete choice modeling approach widely used in marketing research for finding utilities and preference probabilities among multiple alternatives. It can be seen as an extension of the paired comparisons of the Thurstone and Bradley–Terry techniques to the simultaneous presentation of three, four or more items to respondents. A respondent identifies the best and the worst items, so the remaining alternatives are deemed intermediate in preference. Estimation of individual utilities is usually performed via hierarchical Bayesian (HB) multinomial logit (MNL) modeling. The MNL model can be reduced to a logit model on data composed of two specially constructed design matrices of prevalence from the best and the worst sides. The composed data can be of large size, which makes logistic modeling less precise and very demanding in computer time and memory. This paper describes how the results for utilities and choice probabilities can be obtained from the raw data, and how empirical Bayes techniques can be applied instead of HB methods. This approach enriches MaxDiff and is useful for estimation on large data sets. The results of the analytical approach are compared with HB-MNL and several other techniques. Citation: Advances in Adaptive Data Analysis PubDate: 2016-04-07T09:03:19Z DOI: 10.1142/S1793536915500028
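A count-based approximation to MaxDiff utilities can be sketched from the raw best/worst choices: a smoothed log best-to-worst ratio per item, converted to multinomial-logit choice shares. This is an illustrative simplification assuming add-half smoothing; it is not the paper's empirical Bayes estimator.

```python
import math
from collections import Counter

def maxdiff_scores(tasks):
    """tasks: list of (shown_items, best_item, worst_item) tuples.
    Returns a smoothed log best/worst count ratio per item as its utility."""
    best, worst, shown = Counter(), Counter(), Counter()
    for items, b, w in tasks:
        shown.update(items)
        best[b] += 1
        worst[w] += 1
    # 0.5 is an add-half smoothing constant so never-chosen items stay finite.
    return {i: math.log((best[i] + 0.5) / (worst[i] + 0.5)) for i in shown}

def choice_probabilities(utils):
    """Multinomial-logit share of preference implied by the utilities."""
    expu = {i: math.exp(u) for i, u in utils.items()}
    total = sum(expu.values())
    return {i: v / total for i, v in expu.items()}

# Three triples over items A-D; A is always best and D always worst when shown.
tasks = [
    (["A", "B", "C"], "A", "C"),
    (["A", "C", "D"], "A", "D"),
    (["B", "C", "D"], "B", "D"),
]
utils = maxdiff_scores(tasks)
probs = choice_probabilities(utils)
```

Because only counts are accumulated, the cost grows with the number of choice tasks rather than with the size of the expanded best/worst design matrices.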

Authors: Gregori J. Clarke, Samuel S. P. Shen Abstract: Advances in Adaptive Data Analysis, Volume 07, Issue 01n02, April 2015. This study uses the Hilbert–Huang transform (HHT), a signal analysis method for nonlinear and non-stationary processes, to separate signals of varying frequencies in a nonlinear system governed by the Lorenz equations. Similar to the Fourier series expansion, HHT decomposes a data time series into a sum of intrinsic mode functions (IMFs) using empirical mode decomposition (EMD). Unlike the Fourier series, which may require infinitely many terms, EMD always yields a finite number of IMFs, whose sum exactly equals the original time series. Using the HHT approach, the properties of the Lorenz attractor are interpreted in a time–frequency frame. This frame shows that: (i) the attractor is symmetric for [math] (i.e. invariant under [math]), even though the signs of [math] and [math] are changed; (ii) the attractor is sensitive to initial conditions, even under a small perturbation, as measured by the divergence of the trajectories over time; (iii) the Lorenz system goes through “windows” of chaos and periodicity; and (iv) at times, the system can be both chaotic and periodic for a given [math] value. The IMFs are a finite collection of decomposed quasi-periodic signals, ordered from the highest to the lowest frequency, enabling detection of lower-frequency signals that might otherwise be “hidden” by their higher-frequency counterparts. While EMD decomposes the original signal into a family of distinct IMF signals, the Hilbert spectra are a “family portrait” of the time–frequency–amplitude interplay of all IMF members. Together with the IMF energy, this makes it easy to discern where each IMF resides in the spectra in relation to the others. In this study, the majority of high-amplitude signals appear at low frequencies, approximately 0.5–1.5.
Although our numerical experiments are limited to only two specific cases, our HHT analyses of time–frequency, marginal spectra, and energy and quasi-periodicity of each IMF provide a novel approach to exploring the profound and phenomena-rich Lorenz system. Citation: Advances in Adaptive Data Analysis PubDate: 2016-04-07T09:03:14Z DOI: 10.1142/S1793536915500041
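The Hilbert step of the HHT can be sketched as follows: an FFT-based analytic signal yields an instantaneous phase whose derivative is the instantaneous frequency plotted in a Hilbert spectrum. The sifting (EMD) stage is omitted here, and a pure sinusoid stands in for a single IMF; a Lorenz trajectory is not simulated.

```python
import numpy as np

def analytic_signal(x):
    """FFT-based analytic signal (discrete Hilbert transform)."""
    n = len(x)
    spectrum = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    h[1:(n + 1) // 2] = 2.0                # double the positive frequencies
    if n % 2 == 0:
        h[n // 2] = 1.0                    # keep the Nyquist bin as-is
    return np.fft.ifft(spectrum * h)       # real part reproduces x

def instantaneous_frequency(x, fs):
    """Instantaneous frequency (Hz) from the unwrapped analytic phase."""
    phase = np.unwrap(np.angle(analytic_signal(x)))
    return np.diff(phase) * fs / (2.0 * np.pi)

fs = 100.0
t = np.arange(0, 10, 1 / fs)
x = np.sin(2 * np.pi * 2.0 * t)            # a 2 Hz stand-in for one IMF
f = instantaneous_frequency(x, fs)         # stays near 2 Hz away from the edges
```

Plotting such an instantaneous frequency against time for every IMF, colored by the analytic amplitude, gives the time–frequency–amplitude "family portrait" described above.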