  Subjects -> STATISTICS (Total: 130 journals)
Journals sorted by number of followers
Review of Economics and Statistics     Hybrid Journal   (Followers: 154)
Statistics in Medicine     Hybrid Journal   (Followers: 149)
Journal of Econometrics     Hybrid Journal   (Followers: 83)
Journal of the American Statistical Association     Full-text available via subscription   (Followers: 72, SJR: 3.746, CiteScore: 2)
Advances in Data Analysis and Classification     Hybrid Journal   (Followers: 53)
Biometrics     Hybrid Journal   (Followers: 52)
Sociological Methods & Research     Hybrid Journal   (Followers: 45)
Journal of the Royal Statistical Society, Series B (Statistical Methodology)     Hybrid Journal   (Followers: 41)
Journal of Business & Economic Statistics     Full-text available via subscription   (Followers: 40, SJR: 3.664, CiteScore: 2)
Journal of the Royal Statistical Society Series C (Applied Statistics)     Hybrid Journal   (Followers: 37)
Computational Statistics & Data Analysis     Hybrid Journal   (Followers: 35)
Oxford Bulletin of Economics and Statistics     Hybrid Journal   (Followers: 33)
Journal of Risk and Uncertainty     Hybrid Journal   (Followers: 33)
Statistical Methods in Medical Research     Hybrid Journal   (Followers: 30)
Journal of the Royal Statistical Society, Series A (Statistics in Society)     Hybrid Journal   (Followers: 28)
The American Statistician     Full-text available via subscription   (Followers: 26)
Journal of Urbanism: International Research on Placemaking and Urban Sustainability     Hybrid Journal   (Followers: 24)
Journal of Biopharmaceutical Statistics     Hybrid Journal   (Followers: 24)
Journal of Computational & Graphical Statistics     Full-text available via subscription   (Followers: 21)
Journal of Applied Statistics     Hybrid Journal   (Followers: 20)
Journal of Forecasting     Hybrid Journal   (Followers: 20)
British Journal of Mathematical and Statistical Psychology     Full-text available via subscription   (Followers: 18)
Statistical Modelling     Hybrid Journal   (Followers: 18)
International Journal of Quality, Statistics, and Reliability     Open Access   (Followers: 17)
Journal of Statistical Software     Open Access   (Followers: 16, SJR: 13.802, CiteScore: 16)
Journal of Time Series Analysis     Hybrid Journal   (Followers: 16)
Risk Management     Hybrid Journal   (Followers: 16)
Pharmaceutical Statistics     Hybrid Journal   (Followers: 15)
Computational Statistics     Hybrid Journal   (Followers: 15)
Statistics and Computing     Hybrid Journal   (Followers: 14)
Demographic Research     Open Access   (Followers: 14)
Statistics & Probability Letters     Hybrid Journal   (Followers: 13)
Decisions in Economics and Finance     Hybrid Journal   (Followers: 13)
Journal of Statistical Physics     Hybrid Journal   (Followers: 13)
International Statistical Review     Hybrid Journal   (Followers: 12)
Statistics: A Journal of Theoretical and Applied Statistics     Hybrid Journal   (Followers: 12)
Australian & New Zealand Journal of Statistics     Hybrid Journal   (Followers: 12)
Structural and Multidisciplinary Optimization     Hybrid Journal   (Followers: 12)
Geneva Papers on Risk and Insurance - Issues and Practice     Hybrid Journal   (Followers: 11)
Communications in Statistics - Theory and Methods     Hybrid Journal   (Followers: 11)
Advances in Complex Systems     Hybrid Journal   (Followers: 10)
Journal of Probability and Statistics     Open Access   (Followers: 10)
The Canadian Journal of Statistics / La Revue Canadienne de Statistique     Hybrid Journal   (Followers: 10)
Biometrical Journal     Hybrid Journal   (Followers: 9)
Communications in Statistics - Simulation and Computation     Hybrid Journal   (Followers: 9)
Scandinavian Journal of Statistics     Hybrid Journal   (Followers: 9)
Argumentation et analyse du discours     Open Access   (Followers: 8)
Asian Journal of Mathematics & Statistics     Open Access   (Followers: 8)
Fuzzy Optimization and Decision Making     Hybrid Journal   (Followers: 8)
Current Research in Biostatistics     Open Access   (Followers: 8)
Teaching Statistics     Hybrid Journal   (Followers: 8)
Stata Journal     Full-text available via subscription   (Followers: 8)
Multivariate Behavioral Research     Hybrid Journal   (Followers: 8)
Journal of Educational and Behavioral Statistics     Hybrid Journal   (Followers: 7)
Environmental and Ecological Statistics     Hybrid Journal   (Followers: 7)
Journal of Combinatorial Optimization     Hybrid Journal   (Followers: 7)
Handbook of Statistics     Full-text available via subscription   (Followers: 7)
Lifetime Data Analysis     Hybrid Journal   (Followers: 7)
Significance     Hybrid Journal   (Followers: 7)
Journal of Statistical Planning and Inference     Hybrid Journal   (Followers: 7)
Research Synthesis Methods     Hybrid Journal   (Followers: 7)
Queueing Systems     Hybrid Journal   (Followers: 7)
Journal of Mathematics and Statistics     Open Access   (Followers: 6)
Statistical Methods and Applications     Hybrid Journal   (Followers: 6)
Law, Probability and Risk     Hybrid Journal   (Followers: 6)
International Journal of Computational Economics and Econometrics     Hybrid Journal   (Followers: 6)
Journal of Global Optimization     Hybrid Journal   (Followers: 6)
Applied Categorical Structures     Hybrid Journal   (Followers: 6)
Journal of Nonparametric Statistics     Hybrid Journal   (Followers: 6)
Optimization Methods and Software     Hybrid Journal   (Followers: 5)
Engineering With Computers     Hybrid Journal   (Followers: 5)
CHANCE     Hybrid Journal   (Followers: 5)
Handbook of Numerical Analysis     Full-text available via subscription   (Followers: 4)
Metrika     Hybrid Journal   (Followers: 4)
ESAIM: Probability and Statistics     Open Access   (Followers: 4)
Mathematical Methods of Statistics     Hybrid Journal   (Followers: 4)
Statistical Papers     Hybrid Journal   (Followers: 4)
Sankhya A     Hybrid Journal   (Followers: 3)
Journal of Algebraic Combinatorics     Hybrid Journal   (Followers: 3)
Journal of Theoretical Probability     Hybrid Journal   (Followers: 3)
Journal of Statistical and Econometric Methods     Open Access   (Followers: 3)
Monthly Statistics of International Trade - Statistiques mensuelles du commerce international     Full-text available via subscription   (Followers: 3)
Statistical Inference for Stochastic Processes     Hybrid Journal   (Followers: 3)
Technology Innovations in Statistics Education (TISE)     Open Access   (Followers: 2)
AStA Advances in Statistical Analysis     Hybrid Journal   (Followers: 2)
IEA World Energy Statistics and Balances     Full-text available via subscription   (Followers: 2)
Building Simulation     Hybrid Journal   (Followers: 2)
Stochastics: An International Journal of Probability and Stochastic Processes (formerly Stochastics and Stochastics Reports)     Hybrid Journal   (Followers: 2)
Stochastic Models     Hybrid Journal   (Followers: 2)
Optimization Letters     Hybrid Journal   (Followers: 2)
TEST     Hybrid Journal   (Followers: 2)
Extremes     Hybrid Journal   (Followers: 2)
International Journal of Stochastic Analysis     Open Access   (Followers: 2)
Statistica Neerlandica     Hybrid Journal   (Followers: 1)
Wiley Interdisciplinary Reviews - Computational Statistics     Hybrid Journal   (Followers: 1)
Measurement Interdisciplinary Research and Perspectives     Hybrid Journal   (Followers: 1)
Statistics and Economics     Open Access  
Review of Socionetwork Strategies     Hybrid Journal  
SourceOECD Measuring Globalisation Statistics - SourceOCDE Mesurer la mondialisation - Base de donnees statistiques     Full-text available via subscription  
Journal of the Korean Statistical Society     Hybrid Journal  
Sequential Analysis: Design Methods and Applications     Hybrid Journal  

Statistics and Computing
Journal Prestige (SJR): 2.545
Citation Impact (CiteScore): 2
Number of Followers: 14  
 
  Hybrid journal (It can contain Open Access articles)
ISSN (Print) 0960-3174 - ISSN (Online) 1573-1375
Published by Springer-Verlag
  • Adaptive random neighbourhood informed Markov chain Monte Carlo for
           high-dimensional Bayesian variable selection

      Abstract: We introduce a framework for efficient Markov chain Monte Carlo algorithms targeting discrete-valued high-dimensional distributions, such as posterior distributions in Bayesian variable selection problems. We show that many recently introduced algorithms, such as the locally informed sampler of Zanella (J Am Stat Assoc 115(530):852–865, 2020), the locally informed with thresholded proposal of Zhou et al. (Dimension-free mixing for high-dimensional Bayesian variable selection, 2021) and the adaptively scaled individual adaptation sampler of Griffin et al. (Biometrika 108(1):53–69, 2021), can be viewed as particular cases within the framework. We then describe a novel algorithm, the adaptive random neighbourhood informed sampler, which combines ideas from these existing approaches. We show using several examples of both real and simulated data-sets that a computationally efficient point-wise implementation (PARNI) provides more reliable inferences on a range of variable selection problems, particularly in the very large p setting.
      PubDate: 2022-09-30
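
      The locally informed proposal idea that the framework above builds on can be illustrated on a toy problem. The following is a minimal NumPy sketch of a single-flip informed Metropolis–Hastings sampler for linear-regression variable selection; it is not the PARNI algorithm of the paper, and the BIC-style model score, the square-root balancing function and the simulated data are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: n observations, p candidate covariates, 3 truly active.
n, p = 100, 15
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:3] = [2.0, -1.5, 1.0]
y = X @ beta_true + rng.standard_normal(n)

def log_score(gamma):
    """BIC-style stand-in for the log marginal likelihood of model gamma."""
    k = int(gamma.sum())
    if k == 0:
        rss = float(y @ y)
    else:
        Xg = X[:, gamma.astype(bool)]
        beta_hat, *_ = np.linalg.lstsq(Xg, y, rcond=None)
        rss = float(np.sum((y - Xg @ beta_hat) ** 2))
    return -0.5 * n * np.log(rss / n) - 0.5 * k * np.log(n)

def flip(gamma, j):
    g = gamma.copy()
    g[j] = 1 - g[j]
    return g

def informed_flip_sampler(n_iter=1000):
    """Single-flip MH where the proposal weights flips by sqrt of the score ratio."""
    gamma = np.zeros(p, dtype=int)
    cur = log_score(gamma)
    visits = np.zeros(p)
    for _ in range(n_iter):
        scores = np.array([log_score(flip(gamma, j)) for j in range(p)])
        logw = 0.5 * (scores - cur)                     # "informed" flip weights
        w = np.exp(logw - logw.max()); w /= w.sum()
        j = rng.choice(np.arange(p), p=w)
        prop, prop_score = flip(gamma, j), scores[j]
        rev = np.array([log_score(flip(prop, i)) for i in range(p)])
        logw_rev = 0.5 * (rev - prop_score)
        w_rev = np.exp(logw_rev - logw_rev.max()); w_rev /= w_rev.sum()
        # MH correction so the informed proposal still targets the right posterior.
        if np.log(rng.uniform()) < (prop_score - cur) + np.log(w_rev[j]) - np.log(w[j]):
            gamma, cur = prop, prop_score
        visits += gamma
    return visits / n_iter

print(np.round(informed_flip_sampler(), 2))   # rough posterior inclusion frequencies
```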
       
  • Approximate Laplace importance sampling for the estimation of expected
           Shannon information gain in high-dimensional Bayesian design for nonlinear
           models

      Abstract: One of the major challenges in Bayesian optimal design is to approximate the expected utility function in an accurate and computationally efficient manner. We focus on Shannon information gain, one of the most widely used utilities when the experimental goal is parameter inference. We compare the performance of various methods for approximating expected Shannon information gain in common nonlinear models from the statistics literature, with a particular emphasis on Laplace importance sampling (LIS) and approximate Laplace importance sampling (ALIS), a new method that aims to reduce the computational cost of LIS. Specifically, in order to centre the importance distributions LIS requires computation of the posterior mode for each of a large number of simulated possibilities for the response vector. ALIS substantially reduces the amount of numerical optimization that is required, in some cases eliminating all optimization, by centering the importance distributions on the data-generating parameter values wherever possible. Both methods are thoroughly compared with existing approximations including Double Loop Monte Carlo, nested importance sampling, and Laplace approximation. It is found that LIS and ALIS both give an efficient trade-off between mean squared error and computational cost for utility estimation, and ALIS can be up to 70% cheaper than LIS. Usually ALIS gives an approximation that is cheaper but less accurate than LIS, while still being efficient, giving a useful addition to the suite of efficient methods. However, we observed one case where ALIS is both cheaper and more accurate. In addition, for the first time we show that LIS and ALIS yield superior designs to existing methods in problems with large numbers of model parameters when combined with the approximate co-ordinate exchange algorithm for design optimization.
      PubDate: 2022-09-30
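
      For context, the nested (Double Loop) Monte Carlo estimator that LIS and ALIS are compared against can be written in a few lines. The sketch below estimates expected Shannon information gain for a made-up exponential-decay model with a lognormal prior; the model, prior, design points and sample sizes are all assumptions, and neither LIS nor ALIS is implemented here.

```python
import numpy as np
from scipy.special import logsumexp

rng = np.random.default_rng(1)

# Toy nonlinear model: y = theta1 * exp(-theta2 * t) + Gaussian noise,
# observed at the design points t (the quantity optimised in a real design problem).
sigma = 0.1
t = np.array([0.5, 1.0, 2.0])          # a candidate design (placeholder values)

def sample_prior(size):
    return rng.lognormal(mean=0.0, sigma=0.5, size=(size, 2))

def mean_fn(theta, t):
    return theta[..., :1] * np.exp(-theta[..., 1:2] * t)

def loglik(y, theta, t):
    mu = mean_fn(theta, t)
    return -0.5 * np.sum(((y - mu) / sigma) ** 2 + np.log(2 * np.pi * sigma**2), axis=-1)

def eig_nested_mc(t, n_outer=500, n_inner=500):
    """Double-loop Monte Carlo estimate of expected Shannon information gain."""
    theta_out = sample_prior(n_outer)
    y = mean_fn(theta_out, t) + sigma * rng.standard_normal((n_outer, len(t)))
    ll_self = loglik(y, theta_out, t)                       # log p(y_i | theta_i)
    theta_in = sample_prior(n_inner)
    # log p(y_i) ~= log mean_m p(y_i | theta_m), estimated with fresh inner samples.
    ll_cross = loglik(y[:, None, :], theta_in[None, :, :], t)   # (n_outer, n_inner)
    log_evidence = logsumexp(ll_cross, axis=1) - np.log(n_inner)
    return np.mean(ll_self - log_evidence)

print(f"Estimated EIG at design {t}: {eig_nested_mc(t):.3f}")
```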
       
  • Spatially relaxed inference on high-dimensional linear models

      Abstract: We consider the inference problem for high-dimensional linear models, when covariates have an underlying spatial organization reflected in their correlation. A typical example of such a setting is high-resolution imaging, in which neighboring pixels are usually very similar. Accurate point and confidence-interval estimation is not possible in this context, with many more covariates than samples and high correlation between covariates. This calls for a reformulation of the statistical inference problem that takes into account the underlying spatial structure: if covariates are locally correlated, it is acceptable to detect them up to a given spatial uncertainty. We thus propose to rely on the \(\delta\)-FWER, that is, the probability of making a false discovery at a distance greater than \(\delta\) from any true positive. With this target measure in mind, we study the properties of ensembled clustered inference algorithms which combine three techniques: spatially constrained clustering, statistical inference, and ensembling to aggregate several clustered inference solutions. We show that ensembled clustered inference algorithms control the \(\delta\)-FWER under standard assumptions, for \(\delta\) equal to the largest cluster diameter. We complement the theoretical analysis with empirical results, demonstrating accurate \(\delta\)-FWER control and decent power achieved by such inference algorithms.
      PubDate: 2022-09-30
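
      A single member of such an ensemble of clustered inference procedures might look like the sketch below: spatially constrained clustering of correlated covariates, followed by ordinary least-squares t-tests at the cluster level, so that detections are only localised up to the cluster diameter. The 1-D layout, the use of FeatureAgglomeration, the Bonferroni correction and all constants are illustrative assumptions; the paper's algorithms use ensembling over several clusterings and high-dimensional inference procedures not shown here.

```python
import numpy as np
from scipy import stats
from sklearn.cluster import FeatureAgglomeration
from sklearn.feature_extraction.image import grid_to_graph

rng = np.random.default_rng(2)

# Toy "1-D image": p covariates on a line, neighbouring columns highly correlated.
n, p, n_clusters = 120, 200, 20
latent = rng.standard_normal((n, n_clusters))
X = np.repeat(latent, p // n_clusters, axis=1) + 0.1 * rng.standard_normal((n, p))
beta = np.zeros(p)
beta[40:50] = 1.0                                 # one spatially contiguous active block
y = X @ beta + rng.standard_normal(n)

# Spatially constrained clustering of covariates (single member of an ensemble).
connectivity = grid_to_graph(n_x=p, n_y=1)
agglo = FeatureAgglomeration(n_clusters=n_clusters, connectivity=connectivity)
Z = agglo.fit_transform(X)                        # n x n_clusters matrix of cluster means

# OLS inference at the cluster level: t-tests on the reduced design.
Q, R = np.linalg.qr(Z)
coef = np.linalg.solve(R, Q.T @ y)
resid = y - Z @ coef
dof = n - n_clusters
sigma2 = resid @ resid / dof
cov = sigma2 * np.linalg.inv(R.T @ R)
tvals = coef / np.sqrt(np.diag(cov))
pvals = 2 * stats.t.sf(np.abs(tvals), df=dof)

# Bonferroni at the cluster level; a detected cluster localises the signal only
# up to its spatial extent (the delta in the delta-FWER of the paper).
detected = np.where(pvals < 0.05 / n_clusters)[0]
for c in detected:
    members = np.where(agglo.labels_ == c)[0]
    print(f"cluster {c}: columns {members.min()}..{members.max()}")
```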
       
  • Exact simulation of normal tempered stable processes of OU type with
           applications

      Abstract: We study the Ornstein-Uhlenbeck process having a symmetric normal tempered stable stationary law and represent its transition distribution in terms of the sum of independent laws. In addition, we write the background driving Lévy process as the sum of two independent Lévy components. Accordingly, we can design two alternate algorithms for the simulation of the skeleton of the Ornstein-Uhlenbeck process. The solution based on the transition law turns out to be faster since it is based on a lower number of computational steps, as confirmed by extensive numerical experiments. We also calculate the characteristic function of the transition density, which is instrumental for the application of the FFT-based method of Carr and Madan (J Comput Finance 2:61–73, 1999) to the pricing of a strip of call options written on markets whose price evolution is modeled by such Ornstein-Uhlenbeck dynamics. This setting is indeed common for spot prices in the energy field. Finally, we show how to extend the range of applications to futures markets.
      PubDate: 2022-09-27
       
  • Geometry-informed irreversible perturbations for accelerated convergence
           of Langevin dynamics

      Abstract: We introduce a novel geometry-informed irreversible perturbation that accelerates convergence of the Langevin algorithm for Bayesian computation. It is well documented that there exist perturbations to the Langevin dynamics that preserve its invariant measure while accelerating its convergence. Irreversible perturbations and reversible perturbations (such as Riemannian manifold Langevin dynamics (RMLD)) have separately been shown to improve the performance of Langevin samplers. We consider these two perturbations simultaneously by presenting a novel form of irreversible perturbation for RMLD that is informed by the underlying geometry. Through numerical examples, we show that this new irreversible perturbation can improve estimation performance over irreversible perturbations that do not take the geometry into account. Moreover we demonstrate that irreversible perturbations generally can be implemented in conjunction with the stochastic gradient version of the Langevin algorithm. Lastly, while continuous-time irreversible perturbations cannot impair the performance of a Langevin estimator, the situation can sometimes be more complicated when discretization is considered. To this end, we describe a discrete-time example in which irreversibility increases both the bias and variance of the resulting estimator.
      PubDate: 2022-09-19
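
      The generic (non-geometric) irreversible perturbation can be illustrated in a few lines: adding a skew-symmetric term to the drift of the unadjusted Langevin algorithm leaves the target invariant in continuous time while changing the dynamics. The Gaussian target, step size and perturbation strength below are made up, and the geometry-informed RMLD version of the paper is not implemented.

```python
import numpy as np

rng = np.random.default_rng(3)

# Target: zero-mean Gaussian with a strongly correlated covariance.
Sigma = np.array([[1.0, 0.9], [0.9, 1.0]])
P = np.linalg.inv(Sigma)                       # precision matrix

def grad_log_pi(x):
    return -P @ x

def ula(n_steps=50_000, h=0.01, gamma=1.0):
    """Unadjusted Langevin with drift (I + gamma*J) grad log pi, J skew-symmetric.

    For any constant skew-symmetric J the perturbed continuous-time dynamics keep
    pi invariant; gamma = 0 recovers the standard overdamped Langevin diffusion."""
    J = np.array([[0.0, 1.0], [-1.0, 0.0]])
    A = np.eye(2) + gamma * J
    x = np.zeros(2)
    samples = np.empty((n_steps, 2))
    for i in range(n_steps):
        x = x + h * A @ grad_log_pi(x) + np.sqrt(2 * h) * rng.standard_normal(2)
        samples[i] = x
    return samples

for g in (0.0, 1.0):
    s = ula(gamma=g)
    print(f"gamma={g}: sample covariance\n{np.cov(s.T).round(2)}")
```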
       
  • Mixture of multivariate Gaussian processes for classification of
           irregularly sampled satellite image time-series

      Abstract: The classification of irregularly sampled satellite image time-series (SITS) is investigated in this paper. A multivariate Gaussian process mixture model is proposed to address the irregular sampling, the multivariate nature of the time-series and the scalability to large data-sets. The spectral and temporal correlation is handled using a Kronecker structure on the covariance operator of the Gaussian process. The multivariate Gaussian process mixture model allows both for the classification of time-series and the imputation of missing values. Experimental results on simulated and real SITS data illustrate the importance of taking into account the spectral correlation to ensure a good behavior in terms of classification accuracy and reconstruction errors.
      PubDate: 2022-09-19
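
      The Kronecker (separable) covariance mentioned above is easy to sketch: a small cross-band correlation matrix combined with a temporal kernel on irregular dates, with solves performed factor-by-factor. The RBF kernel, nugget and correlation values are assumptions, and the mixture-model classification machinery of the paper is not shown.

```python
import numpy as np

rng = np.random.default_rng(4)

# Separable (Kronecker) covariance for a multivariate time series:
# full covariance = B (cross-band correlation) kron K_time (temporal correlation).
times = np.sort(rng.uniform(0, 1, size=25))            # irregular observation dates
n_bands = 4

def rbf(a, b, ell, var=1.0):
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / ell) ** 2)

K_time = rbf(times, times, ell=0.15) + 1e-4 * np.eye(len(times))   # nugget for stability
B = 0.6 * np.ones((n_bands, n_bands)) + 0.4 * np.eye(n_bands)      # spectral correlation
K = np.kron(B, K_time)

# One draw from the zero-mean GP, reshaped to (bands, times).
L = np.linalg.cholesky(K)
sample = (L @ rng.standard_normal(K.shape[0])).reshape(n_bands, len(times))
print("draw shape:", sample.shape)

# The Kronecker structure gives cheap solves: (B kron K_time)^{-1} v can be
# applied with the two small factors instead of the big matrix.
v = rng.standard_normal(n_bands * len(times))
V = v.reshape(n_bands, len(times))
fast = np.linalg.solve(B, np.linalg.solve(K_time, V.T).T)
slow = np.linalg.solve(K, v).reshape(n_bands, len(times))
print("Kronecker solve matches dense solve:", np.allclose(fast, slow))
```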
       
  • Sequential sampling of junction trees for decomposable graphs

      Abstract: The junction-tree representation provides an attractive structural property for organising a decomposable graph. In this study, we present two novel stochastic algorithms, referred to as the junction-tree expander and junction-tree collapser, for sequential sampling of junction trees for decomposable graphs. We show that recursive application of the junction-tree expander, which expands incrementally the underlying graph with one vertex at a time, has full support on the space of junction trees for any given number of underlying vertices. On the other hand, the junction-tree collapser provides a complementary operation for removing vertices in the underlying decomposable graph of a junction tree, while maintaining the junction tree property. A direct application of the proposed algorithms is demonstrated in the setting of sequential Monte Carlo methods, designed for sampling from distributions on spaces of decomposable graphs. Numerical studies illustrate the utility of the proposed algorithms for combinatorial computations on decomposable graphs and junction trees. All the methods proposed in the paper are implemented in the Python library trilearn.
      PubDate: 2022-09-19
       
  • Achieving fairness with a simple ridge penalty

      Abstract: In this paper, we present a general framework for estimating regression models subject to a user-defined level of fairness. We enforce fairness as a model selection step in which we choose the value of a ridge penalty to control the effect of sensitive attributes. We then estimate the parameters of the model conditional on the chosen penalty value. Our proposal is mathematically simple, with a solution that is partly in closed form and produces estimates of the regression coefficients that are intuitive to interpret as a function of the level of fairness. Furthermore, it is easily extended to generalised linear models, kernelised regression models and other penalties, and it can accommodate multiple definitions of fairness. We compare our approach with the regression model from Komiyama et al. (in: Proceedings of machine learning research. 35th international conference on machine learning (ICML), vol 80, pp 2737–2746, 2018), which implements a provably optimal linear regression model, and with the fair models from Zafar et al. (J Mach Learn Res 20:1–42, 2019). We evaluate these approaches empirically on six different data sets, and we find that our proposal provides better goodness of fit and better predictive accuracy for the same level of fairness. In addition, we highlight a source of bias in the original experimental evaluation in Komiyama et al. (in: Proceedings of machine learning research. 35th international conference on machine learning (ICML), vol 80, pp 2737–2746, 2018).
      PubDate: 2022-09-18
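
      The core mechanism, a ridge penalty applied only to the coefficients of sensitive attributes, has a closed-form solution and can be sketched as below. The data-generating process and penalty values are made up, and the paper's procedure for choosing the penalty to meet a target fairness level is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy data: one sensitive attribute s and two ordinary covariates; the outcome
# depends on all three, and s is correlated with x1.
n = 500
s = rng.binomial(1, 0.5, n).astype(float)
x1 = 0.8 * s + rng.standard_normal(n)
x2 = rng.standard_normal(n)
y = 1.0 * s + 2.0 * x1 - 1.0 * x2 + rng.standard_normal(n)

X = np.column_stack([s, x1, x2])
penalized = np.array([1.0, 0.0, 0.0])   # ridge penalty applied only to the sensitive column

def selective_ridge(X, y, lam):
    """Closed-form ridge solution with the penalty restricted to chosen columns."""
    D = np.diag(penalized)
    return np.linalg.solve(X.T @ X + lam * D, X.T @ y)

# Increasing lambda shrinks the sensitive coefficient while leaving the others lightly touched.
for lam in (0.0, 10.0, 1e3, 1e6):
    b = selective_ridge(X, y, lam)
    print(f"lambda={lam:>9.0f}  coef(s)={b[0]:+.3f}  coef(x1)={b[1]:+.3f}  coef(x2)={b[2]:+.3f}")
```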
       
  • Default risk prediction and feature extraction using a penalized deep
           neural network

      Abstract: Online peer-to-peer lending platforms provide loans directly from lenders to borrowers without passing through traditional financial institutions. For lenders on these platforms to avoid loss, it is crucial that they accurately assess default risk so that they can make appropriate decisions. In this study, we develop a penalized deep learning model to predict default risk based on survival data. As opposed to simply predicting whether default will occur, we focus on predicting the probability of default over time. Moreover, by adding an additional one-to-one layer in the neural network, we achieve feature selection and estimation simultaneously by incorporating an \(L_1\)-penalty into the objective function. The minibatch gradient descent algorithm makes it possible to handle massive data. An analysis of real-world loan data and simulations demonstrate the model’s competitive practical performance, which suggests favorable potential applications in peer-to-peer lending platforms.
      PubDate: 2022-09-15
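
      A minimal PyTorch sketch of the one-to-one (diagonal) layer with an \(L_1\) penalty is given below, using a plain binary default label with cross-entropy loss rather than the survival-data likelihood of the paper; the architecture, penalty weight and simulated data are assumptions.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

torch.manual_seed(0)

# Toy data: 1000 loans, 20 features, only the first 3 drive default risk.
n, p = 1000, 20
X = torch.randn(n, p)
logits = 1.5 * X[:, 0] - 2.0 * X[:, 1] + 1.0 * X[:, 2]
y = torch.bernoulli(torch.sigmoid(logits))

class GatedNet(nn.Module):
    """A one-to-one (diagonal) gate layer in front of a small MLP.

    An L1 penalty on the gate weights shrinks the gates of irrelevant features
    towards zero, giving feature selection and prediction in one model."""
    def __init__(self, p, hidden=16):
        super().__init__()
        self.gate = nn.Parameter(torch.ones(p))
        self.mlp = nn.Sequential(nn.Linear(p, hidden), nn.ReLU(), nn.Linear(hidden, 1))

    def forward(self, x):
        return self.mlp(x * self.gate).squeeze(-1)

model = GatedNet(p)
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
bce = nn.BCEWithLogitsLoss()
lam = 1e-2
loader = DataLoader(TensorDataset(X, y), batch_size=64, shuffle=True)

for epoch in range(50):
    for xb, yb in loader:               # minibatch gradient descent
        loss = bce(model(xb), yb) + lam * model.gate.abs().sum()
        opt.zero_grad()
        loss.backward()
        opt.step()

print(model.gate.detach())              # gates of informative features stay large
```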
       
  • Quantile regression feature selection and estimation with grouped
           variables using Huber approximation

      Abstract: This paper considers model selection and estimation for quantile regression with a known group structure in the predictors. For the median case, the model is estimated by minimizing a penalized objective function with Huber loss and the group lasso penalty, while for other quantiles an M-quantile approach, an asymmetric version of Huber loss, is used to approximate the standard quantile loss function. This approximation allows for efficient implementation of algorithms which rely on a differentiable loss function. Rates of convergence are provided which demonstrate the potential advantages of using the group penalty and that bias from the Huber-type approximation vanishes asymptotically. An efficient algorithm is discussed, which provides fast and accurate estimation for quantile regression models. Simulation and empirical results are provided to demonstrate the effectiveness of the proposed algorithm and support the theoretical results.
      PubDate: 2022-09-11
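
      The asymmetric Huber (M-quantile) approximation of the check loss is simple to write down and optimise with a generic gradient-based routine, as sketched below; the group penalty and the algorithmic contributions of the paper are omitted, and the data, smoothing parameter and quantile levels are illustrative.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(6)

# Toy heteroscedastic data: the conditional quantiles of y depend on x.
n = 400
x = rng.uniform(0, 2, n)
y = 1.0 + 2.0 * x + (0.5 + x) * rng.standard_normal(n)
X = np.column_stack([np.ones(n), x])

def asymmetric_huber_loss(beta, tau, gamma=0.1):
    """Smoothed check loss: Huber loss tilted asymmetrically by tau.

    Differentiable everywhere, so generic gradient-based optimisers apply;
    as gamma -> 0 it approaches the usual quantile (pinball) loss."""
    u = y - X @ beta
    huber = np.where(np.abs(u) <= gamma, 0.5 * u**2 / gamma, np.abs(u) - 0.5 * gamma)
    weights = np.where(u >= 0, tau, 1.0 - tau)
    return np.mean(weights * huber)

for tau in (0.1, 0.5, 0.9):
    fit = minimize(asymmetric_huber_loss, x0=np.zeros(2), args=(tau,), method="BFGS")
    print(f"tau={tau}: intercept={fit.x[0]:.2f}, slope={fit.x[1]:.2f}")
```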
       
  • Kryging: geostatistical analysis of large-scale datasets using Krylov
           subspace methods

      Abstract: Analyzing massive spatial datasets using a Gaussian process model poses computational challenges. This is a problem prevailing heavily in applications such as environmental modeling, ecology, forestry and environmental health. We present a novel approximate inference methodology that uses profile likelihood and Krylov subspace methods to estimate the spatial covariance parameters and makes spatial predictions with uncertainty quantification for point-referenced spatial data. “Kryging” combines Kriging and Krylov subspace methods and applies to both observations on a regular grid and irregularly spaced observations, and to any Gaussian process with a stationary isotropic (and certain geometrically anisotropic) covariance function, including the popular Matérn covariance family. We make use of the block Toeplitz structure with Toeplitz blocks of the covariance matrix and use fast Fourier transform methods to bypass the computational and memory bottlenecks of approximating log-determinant and matrix-vector products. We perform extensive simulation studies to show the effectiveness of our model by varying sample sizes, spatial parameter values and sampling designs. A real data application is also performed on a dataset consisting of land surface temperature readings taken by the MODIS satellite. Compared to existing methods, the proposed method performs satisfactorily with much less computation time and better scalability.
      PubDate: 2022-09-08
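
      The FFT trick referred to above can be shown in one dimension: a stationary covariance on a regular grid gives a Toeplitz matrix, and embedding it in a circulant allows matrix-vector products in O(n log n). The sketch below checks the fast product against a dense one; the Matérn parameters and grid are made up, and the block-Toeplitz 2-D case and Krylov solvers used in the paper are not shown.

```python
import numpy as np
from scipy.linalg import toeplitz

rng = np.random.default_rng(7)

# A Matern-3/2 covariance on a regular 1-D grid is Toeplitz: it is fully described
# by its first column c.
n, rho, sigma2 = 512, 0.1, 1.0
h = np.arange(n) / n
c = sigma2 * (1 + np.sqrt(3) * h / rho) * np.exp(-np.sqrt(3) * h / rho)

def toeplitz_matvec(c, v):
    """Fast symmetric-Toeplitz matvec via circulant embedding + FFT."""
    m = len(c)
    circ = np.concatenate([c, c[-2:0:-1]])            # first column of the circulant
    out = np.fft.irfft(np.fft.rfft(circ) * np.fft.rfft(v, len(circ)), len(circ))
    return out[:m]

v = rng.standard_normal(n)
print(np.allclose(toeplitz(c) @ v, toeplitz_matvec(c, v)))
```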
       
  • Improving bridge estimators via f-GAN

      Abstract: Bridge sampling is a powerful Monte Carlo method for estimating ratios of normalizing constants. Various methods have been introduced to improve its efficiency. These methods aim to increase the overlap between the densities by applying appropriate transformations to them without changing their normalizing constants. In this paper, we first give a new estimator of the asymptotic relative mean square error (RMSE) of the optimal Bridge estimator by equivalently estimating an f-divergence between the two densities. We then utilize this framework and propose the f-GAN-Bridge estimator (f-GB) based on a bijective transformation that maps one density to the other and minimizes the asymptotic RMSE of the optimal Bridge estimator with respect to the densities. This transformation is chosen by minimizing a specific f-divergence between the densities. We show f-GB is optimal in the sense that within any given set of candidate transformations, the f-GB estimator can asymptotically achieve an RMSE lower than or equal to that achieved by Bridge estimators based on any other transformed densities. Numerical experiments show that f-GB outperforms existing methods in simulated and real-world examples. In addition, we discuss how Bridge estimators naturally arise from the problem of f-divergence estimation.
      PubDate: 2022-09-03
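
      The baseline that the paper improves on, the iterative optimal bridge estimator of Meng and Wong, can be sketched on two Gaussians whose normalizing-constant ratio is known; the f-GAN transformation step is not included, and the densities and sample sizes are illustrative.

```python
import numpy as np

rng = np.random.default_rng(8)

# Two unnormalised densities whose normalising-constant ratio is known:
# q1 = exp(-x^2/2)  (Z1 = sqrt(2*pi)),  q2 = exp(-x^2/8)  (Z2 = sqrt(8*pi)),
# so the true ratio Z1/Z2 = 0.5.
log_q1 = lambda x: -0.5 * x**2
log_q2 = lambda x: -x**2 / 8.0

n1 = n2 = 5000
x1 = rng.normal(0.0, 1.0, n1)     # samples from the normalised density p1
x2 = rng.normal(0.0, 2.0, n2)     # samples from p2

def bridge_ratio(n_iter=50):
    """Iterative optimal bridge sampling estimate of Z1/Z2 (Meng & Wong scheme)."""
    s1, s2 = n1 / (n1 + n2), n2 / (n1 + n2)
    r = 1.0
    for _ in range(n_iter):
        num = np.mean(np.exp(log_q1(x2)) / (s1 * np.exp(log_q1(x2)) + s2 * r * np.exp(log_q2(x2))))
        den = np.mean(np.exp(log_q2(x1)) / (s1 * np.exp(log_q1(x1)) + s2 * r * np.exp(log_q2(x1))))
        r = num / den
    return r

print(f"bridge estimate of Z1/Z2: {bridge_ratio():.3f}  (true value 0.5)")
```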
       
  • A non-stationary model for spatially dependent circular response data
           based on wrapped Gaussian processes

      Abstract: Circular data can be found across many areas of science, for instance meteorology (e.g., wind directions), ecology (e.g., animal movement directions), or medicine (e.g., seasonality in disease onset). The special nature of these data means that conventional methods for non-periodic data are no longer valid. In this paper, we consider wrapped Gaussian processes and introduce a spatial model for circular data that allows for non-stationarity in the mean and the covariance structure of Gaussian random fields. We use the empirical equivalence between Gaussian random fields and Gaussian Markov random fields which allows us to considerably reduce computational complexity by exploiting the sparseness of the precision matrix of the associated Gaussian Markov random field. Furthermore, we develop tunable priors, inspired by the penalized complexity prior framework, that shrink the model toward a less flexible base model with stationary mean and covariance function. Posterior estimation is done via Markov chain Monte Carlo simulation. The performance of the model is evaluated in a simulation study. Finally, the model is applied to analyzing wind directions in Germany.
      PubDate: 2022-09-03
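
      The wrapped Gaussian construction itself is straightforward: draw a latent Gaussian process and wrap it onto the circle. The sketch below uses a stationary mean and covariance (the paper's contribution is precisely allowing non-stationarity) and made-up locations and kernel parameters.

```python
import numpy as np

rng = np.random.default_rng(9)

# Wrapped Gaussian construction for circular data: draw a latent Gaussian
# process at spatial locations and wrap it onto [0, 2*pi).
coords = rng.uniform(0, 10, size=(80, 2))                    # station locations
d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
K = 1.5 * np.exp(-d / 2.0) + 1e-6 * np.eye(len(coords))      # exponential covariance
mu = np.pi / 2                                               # constant mean direction

latent = mu + np.linalg.cholesky(K) @ rng.standard_normal(len(coords))
angles = np.mod(latent, 2 * np.pi)                           # observed directions

# Circular mean: use the resultant of unit vectors, never the arithmetic mean.
circ_mean = np.mod(np.arctan2(np.sin(angles).mean(), np.cos(angles).mean()), 2 * np.pi)
print(f"circular mean direction: {circ_mean:.2f} rad")
```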
       
  • Convergence rate bounds for iterative random functions using one-shot
           coupling

      Abstract: One-shot coupling is a method of bounding the convergence rate between two copies of a Markov chain in total variation distance, which was first introduced in Roberts and Rosenthal (Stoch Process Appl 99:195–208, 2002) and generalized in Madras and Sezer (Bernoulli 16:882–908, 2010). The method is divided into two parts: the contraction phase, when the chains converge in expected distance, and the coalescing phase, which occurs at the last iteration, when there is an attempt to couple. One-shot coupling does not require the use of any exogenous variables like a drift function or a minorization constant. In this paper, we summarize the one-shot coupling method into the One-Shot Coupling Theorem. We then apply the theorem to two families of Markov chains: the random functional autoregressive process and the autoregressive conditional heteroscedastic process. We provide multiple examples of how the theorem can be used on various models including ones in high dimensions. These examples illustrate how the theorem’s conditions can be verified in a straightforward way. The one-shot coupling method appears to generate tight geometric convergence rate bounds.
      PubDate: 2022-09-02
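
      The contraction phase can be visualised by running two copies of a random-coefficient autoregression under a common-random-number coupling and watching the expected distance shrink geometrically, as in the sketch below; the chain, its coefficients and the number of replications are illustrative, and the theorem itself is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(10)

# Two copies of a random-coefficient AR(1)-type chain, X' = A*X + Z, driven by
# the SAME randomness (a coupling). During the contraction phase the expected
# distance between the copies shrinks geometrically.
n_chains, n_steps = 10_000, 30
x = np.full(n_chains, 5.0)       # copy started far away
y = np.zeros(n_chains)           # copy started at the origin

mean_dist = []
for t in range(n_steps):
    a = rng.uniform(0.2, 0.8, n_chains)      # shared random contraction coefficient
    z = rng.standard_normal(n_chains)        # shared innovation
    x = a * x + z
    y = a * y + z
    mean_dist.append(np.abs(x - y).mean())

# |X_t - Y_t| = (product of the a's) * |X_0 - Y_0|, so the mean distance decays
# roughly like 5 * 0.5^t here.
print(np.round(mean_dist[:8], 4))
```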
       
  • Variational inference and sparsity in high-dimensional deep Gaussian
           mixture models

      Abstract: Gaussian mixture models are a popular tool for model-based clustering, and mixtures of factor analyzers are Gaussian mixture models having parsimonious factor covariance structure for mixture components. There are several recent extensions of mixture of factor analyzers to deep mixtures, where the Gaussian model for the latent factors is replaced by a mixture of factor analyzers. This construction can be iterated to obtain a model with many layers. These deep models are challenging to fit, and we consider Bayesian inference using sparsity priors to further regularize the estimation. A scalable natural gradient variational inference algorithm is developed for fitting the model, and we suggest computationally efficient approaches to the architecture choice using overfitted mixtures where unnecessary components drop out in the estimation. In a number of simulated and two real examples, we demonstrate the versatility of our approach for high-dimensional problems, and demonstrate that the use of sparsity inducing priors can be helpful for obtaining improved clustering results.
      PubDate: 2022-09-01
       
  • A joint estimation approach to sparse additive ordinary differential
           equations

      Abstract: Ordinary differential equations (ODEs) are widely used to characterize the dynamics of complex systems in real applications. In this article, we propose a novel joint estimation approach for generalized sparse additive ODEs where observations are allowed to be non-Gaussian. The new method is unified with existing collocation methods by considering the likelihood, ODE fidelity and sparse regularization simultaneously. We design a block coordinate descent algorithm for optimizing the non-convex and non-differentiable objective function. The global convergence of the algorithm is established. A simulation study and two applications demonstrate the superior performance of the proposed method in estimation and its improved performance in identifying the sparse structure.
      PubDate: 2022-08-23
       
  • A joint latent factor analyzer and functional subspace model for
           clustering multivariate functional data

      Abstract: We introduce a model-based approach for clustering multivariate functional data observations. We utilize theoretical results regarding a surrogate density on the truncated Karhunen–Loève expansions along with a direct sum specification of the functional space to define a matrix normal distribution on functional principal components. This formulation allows for individual parsimonious modelling of the function space and coefficient space of the univariate components of the multivariate functional observations in the form of a subspace projection and latent factor analyzers, respectively. The approach facilitates interpretation at both the full multivariate level and the component level, which is of specific interest when the component functions have clear meaning. We derive an AECM algorithm for fitting the model, and discuss appropriate initialization strategies, convergence and model selection criteria. We demonstrate the model’s applicability through simulation and two data analyses on observations that have many functional components.
      PubDate: 2022-08-20
       
  • Computing marginal likelihoods via the Fourier integral theorem and
           pointwise estimation of posterior densities

      Abstract: In this paper, we present a novel approach to the estimation of a density function at a specific chosen point. With this approach, we can estimate a normalizing constant, or equivalently compute a marginal likelihood, by focusing on estimating a posterior density function at a point. Relying on the Fourier integral theorem, the proposed method is capable of producing quick and accurate estimates of the marginal likelihood, regardless of how samples are obtained from the posterior; that is, it uses the posterior output generated by a Markov chain Monte Carlo sampler to estimate the marginal likelihood directly, with no modification to the form of the estimator on the basis of the type of sampler used. Thus, even for models with complicated specifications, such as those involving challenging hierarchical structures, or for Markov chains obtained from a black-box MCMC algorithm, the method provides a straightforward means of quickly and accurately estimating the marginal likelihood. In addition to developing theory to support the favorable behavior of the estimator, we also present a number of illustrative examples.
      PubDate: 2022-08-16
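
      The idea can be illustrated in a conjugate Normal-Normal model where the marginal likelihood is known exactly: estimate the posterior density at one point with the sinc-kernel estimator implied by the Fourier integral theorem, then apply the likelihood times prior over posterior identity. Exact posterior draws stand in for MCMC output, and the frequency cut-off and model below are assumptions.

```python
import numpy as np

rng = np.random.default_rng(12)

# Conjugate Normal-Normal example where the marginal likelihood is known exactly:
# y_i ~ N(theta, sigma^2), theta ~ N(0, tau^2).
n, sigma, tau = 50, 1.0, 2.0
y = rng.normal(1.2, sigma, n)

post_var = 1.0 / (n / sigma**2 + 1.0 / tau**2)
post_mean = post_var * y.sum() / sigma**2

def log_lik(theta):
    return -0.5 * np.sum((y - theta) ** 2) / sigma**2 - 0.5 * n * np.log(2 * np.pi * sigma**2)

def log_prior(theta):
    return -0.5 * theta**2 / tau**2 - 0.5 * np.log(2 * np.pi * tau**2)

# Exact answer: likelihood x prior / posterior density, all evaluated at the posterior mean.
log_m_exact = log_lik(post_mean) + log_prior(post_mean) + 0.5 * np.log(2 * np.pi * post_var)

# Estimate: same identity, but with the posterior density at that point estimated
# from posterior draws by the sinc-kernel (Fourier integral theorem) estimator.
draws = rng.normal(post_mean, np.sqrt(post_var), 100_000)   # stand-in for MCMC output
R = 200.0                                                   # frequency cut-off
u = post_mean - draws
dens_hat = (R / np.pi) * np.mean(np.sinc(R * u / np.pi))    # mean of sin(R*u) / (pi*u)
log_m_hat = log_lik(post_mean) + log_prior(post_mean) - np.log(dens_hat)

print(f"exact log m(y): {log_m_exact:.3f}   sinc-kernel estimate: {log_m_hat:.3f}")
```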
       
  • Statistic selection and MCMC for differentially private Bayesian
           estimation

      Abstract: This paper concerns differentially private Bayesian estimation of the parameters of a population distribution, when a noisy statistic of a sample from that population is shared to provide differential privacy. This work mainly addresses two problems. (1) What statistics of the sample should be shared privately? For this question, we promote using the Fisher information. We find that the statistic that is most informative in a non-privacy setting may not be the optimal choice under the privacy restrictions. We provide several examples to support that point. We consider several types of data sharing settings and propose several Monte Carlo-based numerical estimation methods for calculating the Fisher information for those settings. The second question concerns inference: (2) Based on the shared statistics, how could we perform effective Bayesian inference? We propose several Markov chain Monte Carlo (MCMC) algorithms for sampling from the posterior distribution of the parameter given the noisy statistic. The proposed MCMC algorithms can be preferred over one another depending on the problem. For example, when the shared statistic is additive with added Gaussian noise, a simple Metropolis-Hastings algorithm that utilises the central limit theorem is a decent choice. We propose more advanced MCMC algorithms for several other cases of practical relevance. Our numerical examples involve comparing several candidate statistics to be shared privately. For each statistic, we perform Bayesian estimation based on the posterior distribution conditional on the privatised version of that statistic. We demonstrate that the relative performance of a statistic, in terms of the mean squared error of the Bayesian estimator based on the corresponding privatised statistic, is adequately predicted by the Fisher information of the privatised statistic.
      PubDate: 2022-08-16
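
      The Gaussian-noise case mentioned in the abstract admits a particularly simple sketch: with an additive statistic and Gaussian privacy noise, the CLT gives an approximately Gaussian likelihood for the released statistic, and a random-walk Metropolis-Hastings sampler targets the posterior of the mean. The clamping needed for a real privacy guarantee, the noise scale and the prior below are all simplifying assumptions.

```python
import numpy as np

rng = np.random.default_rng(11)

# Toy setting: the data holder releases only a noised sample mean. We then run
# Metropolis-Hastings on mu given that single noisy statistic, using the CLT
# approximation  s_noisy | mu ~ N(mu, sigma^2/n + sigma_dp^2).
n, sigma, sigma_dp = 200, 1.0, 0.3
mu_true = 1.7
data = rng.normal(mu_true, sigma, n)
s_noisy = data.mean() + sigma_dp * rng.standard_normal()

def log_post(mu):
    var = sigma**2 / n + sigma_dp**2
    log_lik = -0.5 * (s_noisy - mu) ** 2 / var
    log_prior = -0.5 * mu**2 / 10.0**2            # N(0, 10^2) prior
    return log_lik + log_prior

def mh(n_iter=20_000, step=0.3):
    mu, lp = 0.0, log_post(0.0)
    out = np.empty(n_iter)
    for i in range(n_iter):
        prop = mu + step * rng.standard_normal()
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            mu, lp = prop, lp_prop
        out[i] = mu
    return out

draws = mh()[5000:]
print(f"posterior mean {draws.mean():.2f}, 95% CI ({np.quantile(draws, 0.025):.2f}, "
      f"{np.quantile(draws, 0.975):.2f})")
```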
       
  • A 4D-Var method with flow-dependent background covariances for the
           shallow-water equations

      Abstract: The 4D-Var method for filtering partially observed nonlinear chaotic dynamical systems consists of finding the maximum a-posteriori (MAP) estimator of the initial condition of the system given observations over a time window, and propagating it forward to the current time via the model dynamics. This method forms the basis of most currently operational weather forecasting systems. In practice the optimisation becomes infeasible if the time window is too long due to the non-convexity of the cost function, the effect of model errors, and the limited precision of the ODE solvers. Hence the window has to be kept sufficiently short, and the observations in the previous windows can be taken into account via a Gaussian background (prior) distribution. The choice of the background covariance matrix is an important question that has received much attention in the literature. In this paper, we define the background covariances in a principled manner, based on observations in the previous b assimilation windows, for a parameter \(b\ge 1\). The method is at most b times more computationally expensive than using fixed background covariances, requires little tuning, and greatly improves the accuracy of 4D-Var. As a concrete example, we focus on the shallow-water equations. The proposed method is compared against state-of-the-art approaches in data assimilation and is shown to perform favourably on simulated data. We also illustrate our approach on data from the recent tsunami of 2011 in Fukushima, Japan.
      PubDate: 2022-08-11
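
      A toy strong-constraint 4D-Var with a fixed background covariance is sketched below: a two-dimensional linear rotation stands in for the shallow-water dynamics, and the analysis is the minimiser of the background-plus-observation cost over the initial condition. The flow-dependent background covariances that are the paper's contribution are not implemented, and all constants are made up.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(13)

# Toy strong-constraint 4D-Var: linear rotation dynamics, with the first state
# component observed at every step within the assimilation window.
theta = 0.3
M = np.array([[np.cos(theta), -np.sin(theta)], [np.sin(theta), np.cos(theta)]])
H = np.array([[1.0, 0.0]])
T, obs_sd = 20, 0.2

x_true0 = np.array([1.0, -0.5])
x = x_true0.copy()
obs = []
for _ in range(T):
    x = M @ x
    obs.append(H @ x + obs_sd * rng.standard_normal(1))
obs = np.array(obs)

x_b = x_true0 + 0.5 * rng.standard_normal(2)      # background (prior) guess
B = 0.25 * np.eye(2)                              # fixed background covariance
B_inv, R_inv = np.linalg.inv(B), np.eye(1) / obs_sd**2

def cost(x0):
    """J(x0) = background misfit + sum of observation misfits along the window."""
    J = 0.5 * (x0 - x_b) @ B_inv @ (x0 - x_b)
    x = x0.copy()
    for k in range(T):
        x = M @ x
        innov = obs[k] - H @ x
        J += 0.5 * innov @ R_inv @ innov
    return J

x0_map = minimize(cost, x_b, method="BFGS").x
print("background guess:", np.round(x_b, 3))
print("4D-Var analysis :", np.round(x0_map, 3), " true:", x_true0)
```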
       
 
JournalTOCs
School of Mathematical and Computer Sciences
Heriot-Watt University
Edinburgh, EH14 4AS, UK
Email: journaltocs@hw.ac.uk
Tel: +00 44 (0)131 4513762
 



JournalTOCs © 2009-