Authors:Wesley Lee; Bailey Fosdick, Tyler McCormick Abstract: Relational event data, which consist of events involving pairs of actors over time, are now commonly available at the finest of temporal resolutions. Existing continuous-time methods for modeling such data are based on point processes and directly model interaction “contagion,” whereby one interaction increases the propensity of future interactions among actors, often as dictated by some latent variable structure. In this article, we present an alternative approach to using temporal-relational point process models for continuous-time event data. We characterize interactions between a pair of actors as either spurious or as resulting from an underlying, persistent connection in a latent social network. We argue that consistent deviations from expected behavior, rather than solely high-frequency counts, are crucial for identifying well-established underlying social relationships. This study aims to explore these latent network structures in two contexts: one comprising college students and another involving barn swallows. PubDate: 2017-10-20T06:33:42.255857-05: DOI: 10.1002/asmb.2285

Authors:Meihua Wang; Fengmin Xu, Yu-Hong Dai Abstract: This paper investigates the portfolio strategy problem for passive fund management. We propose a novel portfolio strategy that combines the existing stratified strategy and optimized sampling strategy. The proposed method enables one to include adequate practical information in portfolio decision making, and promotes better out-of-sample performance. A mixed-integer programming model is built that captures the stratification information, the cardinality requirement, and other practical constraints. The corresponding model is able to forecast and generate optimal tracking portfolios with high performance, especially in the out-of-sample period. As mixed-integer programming is well known to be NP-hard, to tackle the computational challenge, we propose a stratified hybrid genetic algorithm, in which a novel crossover operator is introduced. To evaluate the proposed strategy and algorithm, we conduct numerical tests on real data sets collected from China Stock Exchange Markets. The experimental results show that the algorithm runs efficiently and that the portfolio strategy performs significantly better than other existing strategies. PubDate: 2017-10-16T02:35:28.833612-05: DOI: 10.1002/asmb.2287
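The paper's stratified crossover operator is not specified in the abstract, but the key constraint it must respect, the cardinality requirement, can be illustrated with a generic cardinality-preserving crossover. The set-based chromosome encoding and the function name `crossover` are assumptions for illustration, not the paper's operator.

```python
import random

def crossover(parent_a, parent_b, k, rng):
    """Combine two portfolios (sets of asset indices), keeping cardinality k.

    Assumption: each chromosome is a set of k asset indices. Assets held by
    both parents are always inherited; remaining slots are filled randomly
    from the rest of the parents' union.
    """
    common = parent_a & parent_b              # assets both parents hold
    pool = list((parent_a | parent_b) - common)
    rng.shuffle(pool)
    child = set(common)
    for asset in pool:                        # fill remaining slots
        if len(child) == k:
            break
        child.add(asset)
    return child

rng = random.Random(0)
a = {0, 1, 2, 3, 4}
b = {3, 4, 5, 6, 7}
child = crossover(a, b, k=5, rng=rng)
```

Any child produced this way automatically satisfies the cardinality constraint, so no repair step is needed after crossover.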

Authors:Mohamed Ben Mzoughia; Sharad Borle, Mohamed Limam Abstract: One of the major challenges associated with the measurement of customer lifetime value is selecting an appropriate model for predicting customer future transactions. Among such models, the Pareto/negative binomial distribution (Pareto/NBD) is the most prevalent in noncontractual relationships characterized by latent customer defections; ie, defections are not observed by the firm when they happen. However, this model and its applications have some shortcomings. Firstly, a methodological shortcoming is that the Pareto/NBD, like all lifetime transaction models based on statistical distributions, assumes that the number of transactions by a customer follows a Poisson distribution. However, many applications have an empirical distribution that does not fit a Poisson model. Secondly, a computational concern is that the implementation of the Pareto/NBD model presents some estimation challenges, specifically related to the numerous evaluations of the Gaussian hypergeometric function. Finally, the model provides 4 parameters as output, which is insufficient to link the individual purchasing behavior to socio-demographic information and to predict the behavior of new customers. In this paper, we model a customer's lifetime transactions using the Conway-Maxwell-Poisson distribution, which is a generalization of the Poisson distribution, offering more flexibility and a better fit to real-world discrete data. To estimate parameters, we propose a Markov chain Monte Carlo algorithm, which is easy to implement. Use of this Bayesian paradigm provides individual customer estimates, which help link purchase behavior to socio-demographic characteristics and an opportunity to target individual customers. PubDate: 2017-09-29T11:15:05.007395-05: DOI: 10.1002/asmb.2276
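As a sketch of the distributional claim, the Conway-Maxwell-Poisson pmf can be computed by normalizing the weights λ^y / (y!)^ν: ν = 1 recovers the Poisson, while ν > 1 gives under-dispersion (variance below the mean). Truncating the infinite normalizing sum at a large y is an approximation made here for illustration.

```python
import math

def cmp_pmf(y_max, lam, nu):
    """Conway-Maxwell-Poisson pmf, truncated at y_max.

    Weights are lam**y / (y!)**nu; nu = 1 is the ordinary Poisson,
    nu > 1 under-dispersed, nu < 1 over-dispersed.
    """
    w = [lam**y / math.factorial(y)**nu for y in range(y_max + 1)]
    z = sum(w)                      # truncated normalizing constant
    return [wi / z for wi in w]

def moments(pmf):
    mean = sum(y * p for y, p in enumerate(pmf))
    var = sum((y - mean) ** 2 * p for y, p in enumerate(pmf))
    return mean, var

mean1, var1 = moments(cmp_pmf(60, lam=3.0, nu=1.0))  # Poisson(3): mean = var = 3
mean2, var2 = moments(cmp_pmf(60, lam=3.0, nu=2.0))  # under-dispersed case
```

The extra shape parameter ν is what lets the model absorb count data whose empirical dispersion departs from the Poisson's mean-equals-variance constraint.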

Authors:Mauricio Huerta; Víctor Leiva, Camilo Lillo, Marcelo Rodríguez Abstract: We propose a methodology based on partial least squares (PLS) regression models using the beta distribution, which is useful for describing data measured between zero and one. The beta PLS model parameters are estimated with the maximum likelihood method, whereas a randomized quantile residual and the generalized Cook and Mahalanobis distances are considered as diagnostic methods. A simulation study is provided for evaluating the performance of these diagnostic methods. We illustrate the methodology with real-world mining data. The results obtained in this study based on the beta PLS model and its diagnostics may be of interest for the mining industry. PubDate: 2017-09-29T11:14:50.698196-05: DOI: 10.1002/asmb.2278

Authors:Abdolsaeed Toomaj; Antonio Di Crescenzo, Mahdi Doostparast Abstract: This paper considers information properties of coherent systems when component lifetimes are independent and identically distributed. Some results on the entropy of coherent systems in terms of ordering properties of component distributions are proposed. Moreover, various sufficient conditions are given under which the entropy order among systems as well as the corresponding dual systems hold. Specifically, it is proved that under some conditions, the entropy order among component lifetimes is preserved under coherent system formations. The findings are based on system signatures as a useful measure for comparison purposes. Furthermore, some results on the system's entropy are derived when lifetimes of components are dependent and identically distributed. Several illustrative examples are also given. PubDate: 2017-09-15T06:30:26.599466-05: DOI: 10.1002/asmb.2277

Authors:Athanasios C. Rakitzis; Christian H. Weiß, Philippe Castagliola Abstract: Correlated count data processes with a finite range can be adequately described by a first-order binomial autoregressive model. However, in several practical applications, these data demonstrate extra-binomial variation, and a more appropriate choice is the first-order beta-binomial autoregressive model. In this paper, we propose and study control charts that can be used for the monitoring of these 2 processes. Practical guidelines concerning their statistical design are provided, whereas the effect of the extra-binomial variation is investigated as well. Finally, the practical application of the proposed schemes is illustrated via a real-data example. PubDate: 2017-09-15T05:35:58.938467-05: DOI: 10.1002/asmb.2275

Authors:A. Mostajeran; N. Iranpanah, R. Noorossana Abstract: Various charts such as S, W, and G are used for monitoring process dispersion. Most of these charts are based on the normality assumption; because the exact distribution of the control statistic is unknown, the limiting distribution of the control statistic is employed, which is applicable only for large sample sizes. In practice, the normality assumption might be violated, while it is not always possible to collect large samples. Furthermore, to use control charts in practice, the in-control state usually has to be estimated. Such estimation has a negative effect on the performance of the control chart. Non-parametric bootstrap control charts can be considered as an alternative when the distribution is unknown, when large samples cannot be collected, or when the process parameters are estimated from a Phase I data set. In this paper, non-parametric bootstrap multivariate control charts S, W, and G are introduced, and their performances are compared against Shewhart-type control charts. The proposed method is based on bootstrapping the data used for estimating the in-control state. Simulation results show satisfactory performance for the bootstrap control charts. Ultimately, the proposed control charts are applied to a real case study. PubDate: 2017-08-31T06:07:13.163924-05: DOI: 10.1002/asmb.2272
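A minimal univariate sketch of the bootstrap idea (the paper's charts are multivariate): resample the Phase I data with replacement and take empirical quantiles of the dispersion statistic as control limits, so that no distributional assumption or limiting result is needed. The statistic (sample standard deviation) and the parameter choices are assumptions for illustration.

```python
import random
import statistics

def bootstrap_s_limits(phase1, n, b=2000, alpha=0.0027, seed=1):
    """Percentile bootstrap control limits for an S (sample std dev) chart.

    Draws b subgroups of size n from the Phase I data, with replacement,
    and returns the alpha/2 and 1 - alpha/2 empirical quantiles of S.
    """
    rng = random.Random(seed)
    stats = sorted(
        statistics.stdev(rng.choices(phase1, k=n)) for _ in range(b)
    )
    lo = stats[int(b * alpha / 2)]
    hi = stats[int(b * (1 - alpha / 2))]
    return lo, hi

rng = random.Random(0)
phase1 = [rng.gauss(10.0, 2.0) for _ in range(200)]  # simulated in-control data
lcl, ucl = bootstrap_s_limits(phase1, n=5)
```

Because the limits come from the Phase I sample itself, estimation of the in-control state and the construction of the limits use the same data, which is exactly the situation the paper's bootstrap charts are designed for.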

Authors:Rahim Mahmoudvand; Dimitrios Konstantinides, Paulo Canas Rodrigues Abstract: In this paper, we investigate the possibility of using multivariate singular spectrum analysis (SSA), a nonparametric technique in the field of time series analysis, for mortality forecasting. We consider a real data application with 9 European countries: Belgium, Denmark, Finland, France, Italy, Netherlands, Norway, Sweden, and Switzerland, over the period 1900 to 2009, and a simulation study based on the data set. The results show the superiority of multivariate SSA in comparison with the univariate SSA, in terms of forecasting accuracy. PubDate: 2017-08-25T05:36:03.401668-05: DOI: 10.1002/asmb.2274
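The univariate SSA building block that the comparison rests on can be sketched as embed, SVD, truncate, diagonal averaging; the multivariate version stacks trajectory matrices of several series side by side. The window length and rank below are illustrative choices, not the paper's settings.

```python
import numpy as np

def ssa_reconstruct(x, window, rank):
    """Basic univariate SSA: embed the series into a trajectory matrix,
    keep the leading SVD components, and Hankelize back to a series."""
    n = len(x)
    k = n - window + 1
    X = np.column_stack([x[i:i + window] for i in range(k)])  # trajectory matrix
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    Xr = (U[:, :rank] * s[:rank]) @ Vt[:rank]                 # rank-r approximation
    out = np.zeros(n)                                         # diagonal averaging
    cnt = np.zeros(n)
    for j in range(k):
        out[j:j + window] += Xr[:, j]
        cnt[j:j + window] += 1
    return out / cnt

t = np.arange(120)
x = np.sin(2 * np.pi * t / 12) + 0.1 * np.cos(2 * np.pi * t / 40)
recon = ssa_reconstruct(x, window=24, rank=4)
err = float(np.max(np.abs(recon - x)))
```

Each sinusoid contributes a rank-2 component to the trajectory matrix, so this two-frequency signal is recovered exactly by a rank-4 truncation; forecasting variants extend the reconstruction beyond the observed sample.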

Authors:J. B. Heaton; N. G. Polson, J. H. Witte Abstract: We develop a simple stock selection model to explain why active equity managers tend to underperform a benchmark index. We motivate our model with the empirical observation that the best performing stocks in a broad market index often perform much better than the other stocks in the index. Randomly selecting a subset of securities from the index may dramatically increase the chance of underperforming the index. The relative likelihood of underperformance by investors choosing active management is likely much more important than the loss those same investors incur due to the higher fees of active management relative to passive index investing. Thus, active management may be even more challenging than previously believed, and the stakes for finding the best active managers may be larger than previously assumed. PubDate: 2017-08-22T03:50:35.583666-05: DOI: 10.1002/asmb.2271
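The abstract's central observation, that randomly selecting a subset of securities can dramatically increase the chance of underperforming the index, can be checked with a toy Monte Carlo in which a few "winner" stocks drive the index return. The return profile below is an assumption for illustration, not the paper's data or model.

```python
import random

def underperformance_rate(trials=20000, n_stocks=100, subset=5, seed=7):
    """Fraction of random equal-weight subsets that underperform the
    equal-weight index when a handful of stocks carry the index return.

    Toy setup: 5 winner stocks return 100%, the remaining 95 return 0%.
    A subset underperforms unless it happens to contain a winner.
    """
    rng = random.Random(seed)
    returns = [1.0] * 5 + [0.0] * (n_stocks - 5)
    index_return = sum(returns) / n_stocks      # equal-weight index: 5%
    losses = 0
    for _ in range(trials):
        pick = rng.sample(returns, subset)
        if sum(pick) / subset < index_return:
            losses += 1
    return losses / trials

rate = underperformance_rate()
```

Because a 5-stock subset misses all 5 winners with probability C(95,5)/C(100,5), roughly three quarters of random portfolios underperform here even though the subset return is unbiased, which is the skewness mechanism the abstract describes.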

Authors:D. Neira; G. Soto, J. Fontbona, J. Prado, S. Gaete Abstract: Microseismic sensing networks are important tools for the assessment and control of geomechanical hazards in underground mining operations. In such a setting, the maintenance of a healthy network, that is, one that accurately registers all microseisms above some minimum energy level with acceptable levels of noise, is crucially relevant. In this paper, we develop a nondisruptive method to monitor the health of such a network, by associating with each sensor a set of performance indexes, inspired from reliability engineering, which are estimated from the set of registered signals. Our method addresses 2 relevant features of each sensor's behavior, namely, what type of noise is or might be affecting the registering process, and how effective the sensor is at registering microseisms. The method is evaluated through a case study with microseismic data registered at the Chilean underground mine El Teniente. This study illustrates our method's capability to discriminate and rank sensors with satisfactory, poor, or defective sensing performances, as well as to characterize their failure profile or type, information that can be used to plan or optimize the network maintenance procedures. PubDate: 2017-08-15T03:25:31.158176-05: DOI: 10.1002/asmb.2266

Authors:Peng Jiang; Peter Craig, Alan Crosky, Mojtaba Maghrebi, Ismet Canbulat, Serkan Saydam Abstract: In recent years, there has been an increasing incidence of failure of rock bolts due to stress corrosion cracking and localized corrosion attack in Australian underground coal mines. Unfortunately, prediction of the risk of failure from results obtained from laboratory testing is not necessarily reliable because it is difficult to properly simulate the mine environment. An alternative way of predicting failure is to apply machine learning methods to data obtained from underground mines. In this paper, support vector machines are built to predict failure of bolts in complex mine environments. Feature transformation and feature selection methods are applied to extract useful information from the original data. A dataset with continuous features and spatial data is used to test the proposed model. The results show that principal component analysis-based feature transformation provides reliable risk prediction. PubDate: 2017-08-15T03:10:43.829851-05: DOI: 10.1002/asmb.2273
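The PCA feature-transformation step can be sketched in plain numpy (the support vector machine the paper pairs it with is omitted here): center the features, take the leading eigenvectors of the covariance matrix, and project. The synthetic data with one redundant feature is an assumption for illustration.

```python
import numpy as np

def pca_transform(X, n_components):
    """Project centered features onto the leading principal components."""
    Xc = X - X.mean(axis=0)                        # center each feature
    cov = np.cov(Xc, rowvar=False)
    vals, vecs = np.linalg.eigh(cov)               # ascending eigenvalues
    order = np.argsort(vals)[::-1][:n_components]  # pick the largest
    return Xc @ vecs[:, order]

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))
X[:, 5] = X[:, 0] + 0.01 * rng.normal(size=200)    # redundant (correlated) feature
Z = pca_transform(X, n_components=3)
```

The projected components are uncorrelated by construction, and redundant inputs collapse into a single high-variance component, which is why this transformation tends to help a downstream classifier on correlated mine-monitoring features.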

Authors:Giuseppe Lamberti; Tomas Banet Aluja, Gaston Sanchez Abstract: The problem of heterogeneity represents a very important issue in the decision-making process. Furthermore, it has become common practice in the context of marketing research to assume that different population parameters are possible depending on sociodemographic and psycho-demographic variables such as age, gender, and social status. In recent decades, numerous approaches have been proposed with the aim of involving heterogeneity in the parameter estimation procedures. In partial least squares path modeling, the common practice consists of achieving a global measurement of the differences arising from heterogeneity. This leaves the analyst with the important task of detecting, a posteriori, which causal relationships (ie, path coefficients) produce changes in the model. This is the case in Pathmox analysis, which solves the heterogeneity problem by building a binary tree to detect those segments of population that cause the heterogeneity. In this article, we propose extending the same Pathmox methodology to assess which particular endogenous equation of the structural model and which path coefficients are responsible for the difference. PubDate: 2017-08-11T04:46:22.851701-05: DOI: 10.1002/asmb.2270

Authors:Sílvio Alves Souza; Denise Duarte, Eduardo M. A. M. Mendes Abstract: In this work, a set of sequences of information (time series), under nonstationary regime, with continuous space state, discrete time, and a Markovian dependence, is considered. A new model that expresses the marginal transition density function of one sequence as a linear combination of the marginal transition density functions of all sequences in the set is proposed. The coefficients of this combination are denominated marginal contribution coefficients and represent how much each transition density function contributes to the calculation of a chosen transition density function. The proposed coefficient is a marginal coefficient because it can be computed instantaneously, and it may change from one time to another, since all calculations are performed before stationarity is reached. This clearly differentiates the new coefficient from well-known measures such as the cross-correlation and the coherence. The idea behind the model is that if a specific sequence has a high marginal contribution for the transition density function from another sequence, the former may be replaced by the latter without losing much information, which means that knowledge of a few densities should be enough to recover the overall behaviour. Simulations, considering 2 chains, are presented so as to check the sensitivity of the proposed model. The methodology is also applied to real data originating from a wire-drawing machine whose main function is to decrease the transverse diameter of metal wires. The behaviour of the level of acceleration of each bearing in relation to the others is then verified. PubDate: 2017-08-01T23:01:25.525366-05: DOI: 10.1002/asmb.2262

Authors:Nina Yan; Chongqing Liu, Ye Liu, Baowen Sun Abstract: We constructed a Stackelberg game in a supply chain finance (SCF) system including a manufacturer, a capital-constrained retailer, and a bank that provides loans on the basis of the manufacturer's credit guarantee. To emphasize the financial service providers' risks, we assumed that both the bank and the manufacturer are risk-averse and formulated trade-off objective functions for both of them as the convex combination of the expected profit and conditional value-at-risk. To explore the effects of the risk preferences and decision preferences on SCF equilibria, we mathematically analyzed the optimal order quantities, wholesale prices, and interest rates under different risk preference scenarios and performed numerical analyses to quantify the effects. We found that incorporating bank credit with a credit guarantee can effectively balance the retailer's financing risk between the bank and the manufacturer through interest rate charging and wholesale pricing. Moreover, SCF equilibria with risk aversion are highly affected by the degree of both the lender's and guarantor's risk tolerance in regard to the borrower's default probability and will be more conservative than those in the risk-neutral cases that only maximize expected profit. PubDate: 2017-07-27T02:30:38.598256-05: DOI: 10.1002/asmb.2264

Authors:John Tyssedal; Muhammad Azam Chaudhry Abstract: A screening design is an experimental plan used for identifying the expectedly few active factors from potentially many. In this paper, we compare the performances of 3 experimental plans, a Plackett-Burman design, a minimum run resolution IV design, and a definitive screening design, all with 12 and 13 runs, when they are used for screening and 3 out of 6 factors are active. The functional relationship between the response and the factors was allowed to be of 2 types, a second-order model and a model with all main effects and interactions included. D-efficiencies for the designs' ability to estimate parameters in such models were computed, but it turned out that these are not very informative for comparing the screening performances of the 2-level designs to the definitive screening design. The overall screening performance of the 2-level designs was quite good, but there exist situations where the definitive screening design, allowing both screening and estimation of second-order models in the same operation, has a reasonably high probability of being successful. PubDate: 2017-07-27T02:20:24.213014-05: DOI: 10.1002/asmb.2269
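For reference, the 12-run Plackett-Burman design mentioned above is built from a single generator row, cyclically shifted 11 times, plus a closing row of low levels; its 11 columns are mutually orthogonal, which is what gives the design its screening efficiency.

```python
import numpy as np

# Standard 12-run Plackett-Burman generator row (Plackett & Burman, 1946).
GENERATOR = np.array([1, 1, -1, 1, 1, 1, -1, -1, -1, 1, -1])

def pb12():
    """Build the 12-run Plackett-Burman design: 11 cyclic shifts of the
    generator plus a row of -1s. Rows are runs, columns are factors."""
    rows = [np.roll(GENERATOR, i) for i in range(11)]
    rows.append(-np.ones(11, dtype=int))
    return np.array(rows)

X = pb12()
gram = X.T @ X     # equals 12 * I for an orthogonal 2-level design
```

Orthogonality (X'X = 12I) means all 11 main effects are estimated independently, whereas the definitive screening design trades some of that efficiency for a third level that supports quadratic terms.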

Authors:Masoud Shakibayifar; Erfan Hassannayebi, Hossein Jafary, Arman Sajedinejad Abstract: Urban rail planning is extremely complex, mainly because it is a decision problem under different uncertainties. In practice, travel demand is generally uncertain, and therefore, the timetabling decisions must be based on accurate estimation. This research addresses the optimization of train timetables at public transit terminals of an urban rail in a stochastic setting. To cope with stochastic fluctuation of arrival rates, a two-stage stochastic programming model is developed. The objective is to construct a daily train schedule that minimizes the expected waiting time of passengers. Due to the high computational cost of evaluating the expected value objective, the sample average approximation method is applied. The method provides statistical estimates of the optimality gap as well as lower and upper bounds and the associated confidence intervals. Numerical experiments are performed to evaluate the performance of the proposed model and the solution method. PubDate: 2017-07-20T03:40:45.856594-05: DOI: 10.1002/asmb.2268
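The sample average approximation idea can be sketched on a deliberately simplified single-headway problem: replace the expected-value objective with its average over sampled demand scenarios, then optimize that average. The waiting-time and operating-cost model below is an assumption for illustration, not the paper's two-stage formulation.

```python
import random

def saa_headway(candidates, scenarios, train_cost):
    """Sample average approximation for one headway decision.

    Toy model (an assumption): passengers arrive at uncertain rate lam per
    minute; with headway h, expected waiting cost is lam * h / 2 per minute
    and operating cost is train_cost / h per minute. Pick the candidate
    headway minimizing the scenario-averaged cost.
    """
    def avg_cost(h):
        return sum(lam * h / 2 + train_cost / h for lam in scenarios) / len(scenarios)
    return min(candidates, key=avg_cost)

rng = random.Random(3)
scenarios = [max(0.5, rng.gauss(4.0, 1.0)) for _ in range(500)]  # sampled demand rates
candidates = [h / 2 for h in range(2, 41)]                        # headways 1 to 20 min
best = saa_headway(candidates, scenarios, train_cost=50.0)
```

With enough scenarios the sample-average optimum converges to the true expected-value optimum, and comparing solutions across independent scenario batches yields the optimality-gap bounds and confidence intervals the abstract mentions.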

Authors:Johan Lim; Sungim Lee Abstract: The generalized T2 chart (GT-chart), which is composed of the T2 statistic based on a small number of principal components and the remaining components, is a popular alternative to the traditional Hotelling's T2 control chart. However, the application of the GT-chart to high-dimensional data, which are now ubiquitous, encounters difficulties from high dimensionality similar to other multivariate procedures. The sample principal components and their eigenvalues do not consistently estimate the population values, and the GT-chart relying on them is also inconsistent in estimating the control limits. In this paper, we investigate the effects of high dimensionality on the GT-chart and then propose a corrected GT-chart using the recent results of random matrix theory for the spiked covariance model. We numerically show that the corrected GT-chart exhibits superior performance compared to the existing methods, including the GT-chart and Hotelling's T2 control chart, under various high-dimensional cases. Finally, we apply the proposed corrected GT-chart to monitor chemical processes introduced in the literature. PubDate: 2017-07-16T23:30:31.640334-05: DOI: 10.1002/asmb.2267
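The two pieces the GT-chart monitors, Hotelling's T2 on the leading principal components and a residual (Q-type) statistic on the remaining ones, can be sketched as follows using in-sample eigenpairs. The paper's contribution is precisely the high-dimensional correction of these quantities via random matrix theory, which is not reproduced here.

```python
import numpy as np

def gt_statistics(X, n_components):
    """Per-observation T2 on the leading principal components and the
    residual energy Q on the remaining ones (uncorrected version)."""
    Xc = X - X.mean(axis=0)
    cov = np.cov(Xc, rowvar=False)
    vals, vecs = np.linalg.eigh(cov)               # ascending order
    order = np.argsort(vals)[::-1]
    vals, vecs = vals[order], vecs[:, order]
    scores = Xc @ vecs                             # principal component scores
    t2 = np.sum(scores[:, :n_components] ** 2 / vals[:n_components], axis=1)
    q = np.sum(scores[:, n_components:] ** 2, axis=1)
    return t2, q

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 10))                     # illustrative in-control data
t2, q = gt_statistics(X, n_components=3)
```

When the dimension grows with the sample size, the sample eigenvalues dividing the T2 terms are biased, which is why the uncorrected chart's control limits drift and a spiked-covariance correction is needed.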

Authors:Samir P. Warty; Hedibert F. Lopes, Nicholas G. Polson Abstract: In this work, we investigate sequential Bayesian estimation for inference of stochastic volatility with variance-gamma (SVVG) jumps in returns. We develop an estimation algorithm that combines the sequential learning auxiliary particle filter with the particle learning filter. Simulation evidence and empirical estimation results indicate that this approach is able to filter latent variances, identify latent jumps in returns, and provide sequential learning about the static parameters of SVVG. We demonstrate comparative performance of the sequential algorithm and off-line Markov Chain Monte Carlo in synthetic and real data applications. PubDate: 2017-06-21T03:26:48.533604-05: DOI: 10.1002/asmb.2258

Authors:Julio Mulero; Miguel A. Sordo, Marilia C. de Souza, Alfonso Suárez-LLorens Abstract: Actuarial risks and financial asset returns are typically heavy tailed. In this paper, we introduce 2 stochastic dominance criteria, called the right-tail order and the left-tail order, to compare these variables stochastically. The criteria are based on comparisons of expected utilities, for 2 classes of utility functions that give more weight to the right or the left tail (depending on the context) of the distributions. We study their properties, applications, and connections with other classical criteria, including the increasing convex and the second-order stochastic dominance. Finally, we rank some parametric families of distributions and provide empirical evidence of the new stochastic dominance criteria with an example using real data. PubDate: 2017-06-19T00:21:00.010155-05: DOI: 10.1002/asmb.2260
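The classical second-order stochastic dominance mentioned above can be checked empirically by comparing expected shortfalls over a grid of thresholds: X dominates Y if E[(t - X)+] <= E[(t - Y)+] for every t. The paper's right- and left-tail orders refine this idea with tail-weighted utility classes; the finite-grid check below is an illustrative sketch only.

```python
def ssd_dominates(x, y, grid):
    """Empirical second-order stochastic dominance of sample x over y:
    expected shortfall of x never exceeds that of y on the grid."""
    def shortfall(sample, t):
        return sum(max(t - v, 0.0) for v in sample) / len(sample)
    return all(shortfall(x, t) <= shortfall(y, t) + 1e-12 for t in grid)

x = [1.0, 2.0, 3.0, 4.0]           # x is y shifted up by 1, so x dominates
y = [0.0, 1.0, 2.0, 3.0]
grid = [i / 2 for i in range(-2, 11)]
dom = ssd_dominates(x, y, grid)
```

Every risk-averse expected-utility maximizer prefers a dominating variable, which is the sense in which such orders rank actuarial risks and asset returns.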

Authors:Wanbo Lu; Rui Ke Abstract: In this paper, we extend the closed form moment estimator (ordinary MCFE) for the autoregressive conditional duration model given by Lu et al (2016) and propose some closed form robust moment-based estimators for the multiplicative error model to deal with the additive and innovational outliers. The robustification of the closed form estimator is done by replacing the sample mean and sample autocorrelation with some robust estimators. These estimators are more robust than the quasi-maximum likelihood estimator (QMLE) often used to estimate this model, and they are easy to implement and do not require the use of any numerical optimization procedure or the choice of initial values. The performance of our proposal in estimating the parameters and forecasting the conditional mean μ_t of the MEM(1,1) process is compared with the proposals existing in the literature via Monte Carlo experiments, and the results of these experiments show that our proposal generally outperforms the ordinary MCFE, QMLE, and least absolute deviation estimator in the presence of outliers. Finally, we fit the price durations of IBM stock with the robust closed form estimators and the benchmarks and analyze their performances in estimating model parameters and forecasting the irregularly spaced intraday Value at Risk. PubDate: 2017-06-19T00:16:35.174062-05: DOI: 10.1002/asmb.2259