Abstract: The situation where serviceable products are sold together with a proportion of deteriorating products is rarely discussed in the literature. This article proposes an inventory model with disparate inventory ordering policies for a situation where a portion of serviceable products and a portion of deteriorating products are sold together to consumers (i.e., mixed sales). The ordering policies consider a hybrid payment strategy with multiple prepayments and partial trade credit schemes linked to order quantity, under both the case where no inventory shortage is allowed and the case where shortage is allowed with full backorder. The hybrid payment policy offered by a supplier is introduced into the classical economic order quantity model to investigate the optimal inventory cycle and the fraction of demand that is filled from the deteriorating products under an inspection policy. Further, a new solution method is proposed that identifies the optimal annual total profit with mixed sales, assuming either no inventory shortage or inventory shortage with full backorder. The impact of an inspection policy on the optimality of the solution is investigated under hybrid payment strategies for the deteriorating products. The validation of the proposed model and its solution method is demonstrated through several numerical examples. The results indicate that the inventory model, along with the solution method, provides a powerful tool for retail managers in real-world situations. The results also demonstrate that it is essential for managers to consider including an inspection policy in the mixed sales of products, as the inspection policy significantly increases the net annual profit. PubDate: 2019-09-21
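The classical economic order quantity model that this article extends can be sketched numerically. A minimal illustration follows; the parameter values are hypothetical, and the deterioration, prepayment, trade-credit, and inspection features of the paper's model are omitted:

```python
import math

def eoq(K, D, h):
    """Classical economic order quantity: fixed ordering cost K,
    annual demand D, unit holding cost h per year."""
    return math.sqrt(2 * K * D / h)

Q = eoq(K=100.0, D=1000.0, h=2.0)   # optimal order size
T = Q / 1000.0                      # inventory cycle length in years
```

The paper's model replaces this cost-only trade-off with a profit function that also prices deterioration, payment timing, and backorders.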

Abstract: A cyber-physical system (CPS) is composed of a discrete number of cyber and physical components and is subject to internal failures and external disruptions. The functionality of a CPS is therefore determined not only by its cyber and physical components but also by the adversary’s attack strategy. We characterize the effect of cyber-physical interdependency on the CPS survival probability using a product-form function with cyber and physical exponential correlation coefficients. We model simultaneous and sequential discrete games between the provider and attacker on a CPS infrastructure to analyze its survivability and reinforcement strategy at Nash equilibrium. Our results show that the cyber and physical correlation coefficients can significantly affect CPS survival probability. In general, the provider’s cyber- (or physical-) reinforcement level increases as the cyber- (or physical-) attack level increases. In each of the cyber and physical domains, the reinforcement level first increases then decreases in its own correlation coefficient, probability of successful component attacks, and maximum level of available resources, but decreases in the correlation coefficient of the other domain. We apply this game-theoretic analysis to a cloud computing infrastructure, and show that its residual capacity is relatively high when the attacker has no information about the distribution of servers. Also, a high level of survival probability does not necessarily lead to high utility. PubDate: 2019-09-21
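The simultaneous game between provider and attacker can be illustrated with pure-strategy Nash equilibrium enumeration on a small bimatrix game. This is a generic sketch, not the paper's CPS survival model; the payoff numbers are hypothetical:

```python
from itertools import product

def pure_nash(payoff_a, payoff_b):
    """Enumerate pure-strategy Nash equilibria of a bimatrix game.
    payoff_a[i][j] and payoff_b[i][j] are the row and column players'
    payoffs when the row player picks i and the column player picks j."""
    rows, cols = len(payoff_a), len(payoff_a[0])
    eq = []
    for i, j in product(range(rows), range(cols)):
        best_row = all(payoff_a[i][j] >= payoff_a[k][j] for k in range(rows))
        best_col = all(payoff_b[i][j] >= payoff_b[i][l] for l in range(cols))
        if best_row and best_col:
            eq.append((i, j))
    return eq

# hypothetical 2x2 provider-versus-attacker interaction
# (index 0 = low effort, index 1 = high effort)
provider = [[3, 0], [5, 1]]
attacker = [[3, 5], [0, 1]]
eq = pure_nash(provider, attacker)
```

The paper's analysis works with continuous reinforcement and attack levels rather than a discrete payoff table.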

Abstract: Determining the best location to serve companies’ profitability and sustainability is becoming more crucial every day, since the rivalry between companies is getting more intense. The transformation of economies from manufacturing orientation towards service-based activities has resulted in a growing contribution of the service-based economy to the gross domestic product and workforce of developing countries. These recent changes in the economy indicate that location science related to service facilities has attracted greater interest. Service location problems have been studied since the 1900s, and interest in these problems has grown especially after the aforementioned economic transformation in the 2000s. A large number of problems have been investigated for different service facilities. However, there is a need for a survey that systematically classifies these papers in order to comprehend them thoroughly due to their prominence and complexity. This paper examines 90 papers that have been published on service facility location problems since 2000. The paper presents a classification based on 19 main characteristics, including key features and descriptive dimensions of location problems, in order to develop a taxonomy from an operations research perspective to assist the location scientists and practitioners who work on service facility location problems. Furthermore, service facility location problems are categorized according to their application fields and investigated in detail with respect to each characteristic. We also draw interesting comparisons of characteristics between facility location problems in different application fields and highlight directions for future research. PubDate: 2019-09-21

Abstract: This paper discusses the usefulness of the long term memory property in price prediction. In particular, the Hurst exponents related to a wide set of portfolios generated by three crude oils are estimated using detrended fluctuation analysis. To this aim, daily empirical data on West Texas Intermediate, Brent crude oil and Dubai crude oil for a period of more than 10 years have been considered. It is shown that specific combinations are associated with persistent/antipersistent long-run behaviors, and this highlights the presence of statistical arbitrage opportunities. Such an outcome shows that long term memory can effectively serve as a price predictor. PubDate: 2019-09-20
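The detrended fluctuation analysis (DFA) behind the Hurst-exponent estimates can be sketched as follows: integrate the demeaned series, detrend it linearly within boxes of varying size n, and regress the log of the RMS fluctuation F(n) on log n. This is a minimal stdlib sketch on simulated white noise (for which the exponent is about 0.5), not the paper's crude oil data:

```python
import math
import random

def dfa_exponent(x, scales):
    """DFA scaling exponent: slope of log F(n) versus log n, where F(n)
    is the RMS of linearly detrended fluctuations of the profile series."""
    mean = sum(x) / len(x)
    y, s = [], 0.0
    for v in x:                       # integrated (profile) series
        s += v - mean
        y.append(s)
    logs_n, logs_f = [], []
    for n in scales:
        sq = []
        for i in range(len(y) // n):
            seg = y[i * n:(i + 1) * n]
            tm, sm = (n - 1) / 2.0, sum(seg) / n
            beta = sum((t - tm) * (v - sm) for t, v in enumerate(seg)) \
                 / sum((t - tm) ** 2 for t in range(n))
            alpha = sm - beta * tm
            sq += [(v - (alpha + beta * t)) ** 2 for t, v in enumerate(seg)]
        logs_n.append(math.log(n))
        logs_f.append(0.5 * math.log(sum(sq) / len(sq)))
    mn, mf = sum(logs_n) / len(scales), sum(logs_f) / len(scales)
    return sum((a - mn) * (b - mf) for a, b in zip(logs_n, logs_f)) \
         / sum((a - mn) ** 2 for a in logs_n)

random.seed(1)
noise = [random.gauss(0, 1) for _ in range(4096)]
H = dfa_exponent(noise, [8, 16, 32, 64, 128])   # near 0.5 for white noise
```

Exponents above 0.5 signal persistence and below 0.5 antipersistence, which is the property the paper exploits for prediction.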

Abstract: The 0/1 knapsack polytope is the convex hull of all 0/1 vectors that satisfy a given single linear inequality with non-negative coefficients. This paper provides a comprehensive overview of knapsack polytopes. We discuss basic polyhedral properties, (lifted) cover and other valid inequalities, cases for which complete linear descriptions are known, geometric properties for small dimensions, and connections to independence systems. We also discuss the generalization to (mixed-)integer knapsack polytopes and variants. PubDate: 2019-09-19
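A central object in the overview, the cover inequality, can be illustrated concretely: a cover C is a set of items whose weights exceed the capacity, and it yields the valid inequality sum over C of x_i <= |C| - 1. The sketch below builds a minimal cover for a small hypothetical knapsack and brute-force verifies validity:

```python
from itertools import product

def minimal_cover(a, b):
    """Greedily build a cover C (total weight over C exceeds b), then
    shrink it until it is minimal: dropping any one item makes it fit."""
    order = sorted(range(len(a)), key=lambda i: -a[i])
    C, total = [], 0
    for i in order:
        C.append(i)
        total += a[i]
        if total > b:
            break
    for i in list(C):                 # drop items the cover does not need
        if total - a[i] > b:
            C.remove(i)
            total -= a[i]
    return sorted(C)

a, b = [6, 5, 5, 4], 10               # hypothetical knapsack a.x <= b
C = minimal_cover(a, b)               # cover inequality: sum_{i in C} x_i <= |C|-1
valid = all(sum(x[i] for i in C) <= len(C) - 1
            for x in product((0, 1), repeat=len(a))
            if sum(ai * xi for ai, xi in zip(a, x)) <= b)
```

Lifting, discussed in the overview, then strengthens such inequalities by adding items outside C with positive coefficients.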

Abstract: We study cooperative interval games. These are cooperative games where the value of a coalition is given by a closed real interval specifying a lower bound and an upper bound of the possible outcome. For cooperative interval games, several (interval) solution concepts have been introduced in the literature. We assume that each player has a different attitude towards uncertainty, modeled by means of the so-called Hurwicz coefficients. These coefficients specify the degree of optimism of each player, so that an interval can be collapsed to a specific payoff. We show that a classical cooperative game arises when applying the Hurwicz criterion to each interval game. On the other hand, the same Hurwicz criterion can also be applied to any interval solution of the interval cooperative game. Given this, we say that a solution concept is Hurwicz compatible if the two procedures provide the same final payoff allocation. When such compatibility is possible, we characterize the class of compatible solutions, which reduces to the egalitarian solution when symmetry is required. The Shapley value and the core solution cases are also discussed. PubDate: 2019-09-18
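The first step described in the abstract, collapsing an interval game into a classical TU game via the Hurwicz criterion, is simple to state in code. The two-player interval game below is hypothetical; for simplicity the sketch uses one common degree of optimism rather than player-specific coefficients:

```python
def hurwicz_game(interval_v, alpha):
    """Collapse an interval game to a classical TU game: each coalition's
    interval [lo, hi] becomes alpha*hi + (1-alpha)*lo, where alpha in [0,1]
    is the degree of optimism (Hurwicz coefficient)."""
    return {S: alpha * hi + (1 - alpha) * lo
            for S, (lo, hi) in interval_v.items()}

# hypothetical two-player interval game, coalitions as frozensets
iv = {frozenset(): (0, 0),
      frozenset({1}): (1, 3),
      frozenset({2}): (0, 2),
      frozenset({1, 2}): (4, 8)}
v = hurwicz_game(iv, alpha=0.5)   # v({1}) = 2, v({2}) = 1, v({1,2}) = 6
```

Hurwicz compatibility then asks whether solving this collapsed game gives the same allocation as collapsing an interval solution of the original game.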

Abstract: The limiting availability of an ordinary k-out-of-n : G system can be improved simply by operating exactly k operable units while keeping any other operable units on cold standby. We establish this result when there are different numbers of repair facilities, the repair time is exponential, and either each component has an exponential lifetime, or each component is made up of m identical sub-components that operate sequentially, with each sub-component having an exponential lifetime. PubDate: 2019-09-17
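The cold-standby operating policy can be sketched as a birth-death chain on the number of failed units. This is a simplified illustration, not the paper's analysis: it assumes exponential lifetimes and repairs, that exactly k units run while the system is up, and that no further failures occur once the system is down; all rates are hypothetical:

```python
def limiting_availability(n, k, lam, mu, repairmen):
    """Limiting availability of a k-out-of-n:G system with cold standby.
    State j = number of failed units; failures occur at rate k*lam while
    j <= n-k (system up), repairs at rate min(j, repairmen)*mu.
    Uses the product-form stationary distribution of a birth-death chain."""
    weights = [1.0]
    for j in range(n - k + 1):                 # transition j -> j+1
        birth = k * lam
        death = min(j + 1, repairmen) * mu
        weights.append(weights[-1] * birth / death)
    total = sum(weights)
    pi = [w / total for w in weights]
    return 1.0 - pi[-1]                        # up whenever j <= n-k

A = limiting_availability(n=1, k=1, lam=1.0, mu=2.0, repairmen=1)  # mu/(lam+mu)
```

For the single-unit case this reduces to the familiar mu/(lam+mu), a useful sanity check on the chain.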

Abstract: Recent studies in Lee and Prékopa (Oper Res Lett 45:19–24, 2017) and Lee (Oper Res Lett 45:1204–1220, 2017) showed that a union of partially ordered orthants in \(R^n\) can be decomposed only into the largest and the second largest chains. This allows us to calculate the probability of the union of such events in a recursive manner. If the vertices of such orthants designate p-level efficient points, i.e., the multivariate quantile or the multivariate value-at-risk (MVaR) in \(R^n\) , then the number of them, say N, is typically very large, which makes it almost impossible to calculate the multivariate conditional value-at-risk (MCVaR) introduced by Prékopa (Ann Oper Res 193(1):49–69, 2012). This is because it takes \(O(2^N)\) in the case of N MVaRs in \(R^n\) to find the exact value of MCVaR. In this paper, on the basis of ideas in Lee and Prékopa (Oper Res Lett 45:19–24, 2017) and Lee (Oper Res Lett 45:1204–1220, 2017), together with proper adjustments, we study efficient methods for the calculation of the MCVaR without resorting to an approximation. In fact, the proposed methods not only have polynomial time complexity but also compute the exact value of MCVaR. We also discuss additional benefits MCVaR offers over its univariate counterpart, the conditional value-at-risk, by providing numerical results. Numerical examples are presented with computing times for both population and sample data sets. PubDate: 2019-09-17
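The univariate counterpart mentioned at the end, the conditional value-at-risk, is easy to compute from a sample and helps fix ideas before the multivariate case. A minimal sketch with a hypothetical loss sample (this is not the paper's polynomial-time MCVaR algorithm):

```python
import math

def var_cvar(losses, p):
    """Empirical value-at-risk and conditional value-at-risk at level p:
    VaR is the empirical p-quantile of the losses, and CVaR averages the
    losses at or above that quantile."""
    s = sorted(losses)
    idx = math.ceil(p * len(s)) - 1
    var = s[idx]
    tail = s[idx:]
    return var, sum(tail) / len(tail)

losses = list(range(1, 101))          # hypothetical loss sample 1..100
var, cvar = var_cvar(losses, 0.95)    # VaR = 95, CVaR = mean of 95..100
```

The multivariate difficulty the paper tackles arises because in \(R^n\) there is no single ordering of outcomes, so the "tail" is defined by a large set of p-level efficient points rather than one quantile.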

Abstract: We consider the problem of computing the credit value adjustment (CVA) of a European option in the presence of wrong-way risk in a default intensity setting. Namely, we model the asset price evolution as the solution to a linear equation that might depend on different stochastic factors, and we provide an approximate evaluation of the option’s price by exploiting a correlation expansion approach, introduced in Antonelli and Scarlatti (Finance Stoch 13:269–303, 2009). We also extend our theoretical analysis to include some further value adjustments, for instance due to collateralization and funding costs. Finally, in the CVA case, we compare the numerical performance of our method with the one recently proposed by Brigo and Vrins (Eur J Oper Res 269:1154–1164, 2018) and Brigo et al. (Innovations in insurance, risk and asset management, WSPC proceedings, 2018), in the case of a call option driven by a GBM correlated with a CIR default intensity. We additionally compare with the numerical evaluations obtained by other methods. PubDate: 2019-09-16
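The quantity being approximated can be fixed with the standard unilateral CVA discretization: loss-given-default times the exposure-weighted increments of the default probability. The sketch below uses a constant default intensity and a hypothetical flat exposure profile; it is the textbook formula, not the paper's correlation-expansion method (which handles the dependence between exposure and default that wrong-way risk introduces):

```python
import math

def cva(expected_exposure, times, intensity, recovery):
    """Unilateral CVA on a time grid: (1 - R) * sum_i EE(t_i) * dPD_i,
    with default probability PD(t) = 1 - exp(-intensity * t)."""
    pd = lambda t: 1.0 - math.exp(-intensity * t)
    total, prev = 0.0, 0.0
    for ee, t in zip(expected_exposure, times):
        total += ee * (pd(t) - prev)
        prev = pd(t)
    return (1.0 - recovery) * total

# hypothetical flat exposure of 10 over four quarterly dates
c = cva([10.0] * 4, [0.25, 0.5, 0.75, 1.0], intensity=0.02, recovery=0.4)
```

With a flat exposure the sum telescopes to (1 - R) * EE * PD(T), which gives a quick sanity check.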

Abstract: Accounting for the non-normality of asset returns remains one of the main challenges in portfolio optimization. In this paper, we tackle this problem by assessing the risk of the portfolio through the “amount of randomness” conveyed by its returns. We achieve this using an objective function that relies on the exponential of Rényi entropy, an information-theoretic criterion that precisely quantifies the uncertainty embedded in a distribution, accounting for higher-order moments. Compared to Shannon entropy, Rényi entropy features a parameter that can be tuned to adjust the notion of uncertainty. A Gram–Charlier expansion shows that it controls the relative contributions of the central (variance) and tail (kurtosis) parts of the distribution in the measure. We further rely on a non-parametric estimator of the exponential Rényi entropy that extends a robust sample-spacings estimator initially designed for Shannon entropy. A portfolio-selection application illustrates that minimizing Rényi entropy yields portfolios that outperform state-of-the-art minimum-variance portfolios in terms of risk-return-turnover trade-off. We also show how Rényi entropy can be used in risk-parity strategies. PubDate: 2019-09-14
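The exponential of Rényi entropy is easiest to see in the discrete case, where it acts as an effective number of outcomes. This is only an illustration of the risk measure's behavior; the paper works with continuous return distributions through a spacings-based estimator:

```python
def exp_renyi_entropy(p, alpha):
    """Exponential of the Renyi entropy of order alpha (alpha != 1) of a
    discrete distribution p: (sum_i p_i**alpha) ** (1 / (1 - alpha)).
    Equals the number of outcomes for a uniform distribution and shrinks
    as the distribution concentrates."""
    return sum(q ** alpha for q in p) ** (1.0 / (1.0 - alpha))

uniform = [0.25] * 4
skewed = [0.7, 0.1, 0.1, 0.1]
n_uniform = exp_renyi_entropy(uniform, 2.0)   # 4.0: maximal uncertainty
n_skewed = exp_renyi_entropy(skewed, 2.0)     # below 4: more concentrated
```

Varying alpha reweights the contribution of low- versus high-probability regions, which is the tuning role the Gram-Charlier analysis makes precise.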

Abstract: This erratum is published because the vendor overlooked corrections and the section headings were given in a numbered, run-on format. PubDate: 2019-09-11

Abstract: Emissions trading schemes have been widely implemented by many countries to enforce the “cap and trade” concept for mitigating CO2 emissions. Thus, the carbon price influences the manufacturing costs in all stages of production, recycling, and disposal. Consideration of the carbon price is especially important for the economic efficiency of the downstream manufacturing sectors, such as plastic product manufacturing, which can substantially reduce their costs through the design and management of networked supply chains: purchasing feedstocks from different technological routes, as well as choosing plants, warehouses and various transportation modes with diverse CO2 emission intensities. Supporting the decision-making in such situations requires the integration of life cycle analysis and networked supply chain management methodologies with an analysis of the carbon-market uncertainties. Such approaches have not been sufficiently quantified in the existing literature. This study presents a stochastic mixed-integer linear programming model developed for polyvinyl chloride pipe manufacturing in China, which is used to evaluate the effects of the life cycle emissions of procurement on the whole supply chain under carbon market uncertainty. Our results illustrate that carbon market uncertainty significantly influences not only the carbon-intensive production sectors but also the downstream manufacturing sectors. The five scenarios with carbon price variation exhibit distinctly different choices in procurement and supply chain configurations, as well as in their performances regarding total emissions and associated costs. PubDate: 2019-09-11

Abstract: Construction of new dams in undeveloped transboundary basins causes two serious disputes between the stakeholders: conflicts over greater water interests and over the new dams’ locations. Hence, water development planning of these basins needs to be done in conjunction with the examination of stakeholders’ new water shares. This study extends the model presented in Roozbahani et al. (Water Resour Manag 31:4539–4556, 2017) to be multi-objective and applies the methodology outlined in Roozbahani et al. (Ann Oper Res 229:657–676, 2015a) to solve the model. The proposed three-step approach determines the equitable allocation of the surface water of an undeveloped transboundary basin while determining the optimal number, locations and capacities of new dams. The first step utilizes a mixed-integer multi-objective model to outline the water shares of stakeholders, as well as optimal dam locations for a given number of dams. Using a sensitivity analysis, the second step pinpoints the required number of dams. The role of the third step is the exploration of the dams’ lowest possible capacities. Environmentally, our approach takes the entire watershed’s water requirements into account. We have applied the proposed approach to the Sefidrud Basin, a transboundary basin located in Iran. The results of the approach show that, to significantly improve the security of the Sefidrud Basin’s water supply, three new dams would be optimal. PubDate: 2019-09-11

Abstract: This paper develops a recovery planning approach for a three-tier manufacturing supply chain, consisting of a single supplier, manufacturer, and retailer, under an imperfect production environment. We consider three types of sudden disturbance that are not known in advance: demand fluctuation, and disruptions to production and to raw material supply. Firstly, a mathematical model is developed for generating an ideal plan under imperfect production for a finite planning horizon while maximizing total profit, and then we re-formulate the model to generate the recovery plan after the occurrence of each sudden disturbance. Considering the high commercial cost and the computational intensity and complexity of this problem, we propose an efficient heuristic to obtain a recovery plan, for each disturbance type, for a finite future period after the occurrence of a disturbance. The heuristic solutions are compared with a standard solution technique for a considerable number of random test instances, which demonstrates the reliable performance of the developed heuristics. We also develop another heuristic for managing the combined effects of multiple sudden disturbances in a period. Finally, a simulation approach is proposed to investigate the effects of different types of randomly generated disturbance events. We present several numerical examples and random experiments to demonstrate the benefits of our developed approaches. Results reveal that in the event of sudden disturbances, the proposed mathematical and heuristic approaches are capable of generating recovery plans accurately and consistently. PubDate: 2019-09-01

Abstract: We describe several analytical results obtained for four-candidate social choice elections under the assumption of the Impartial Anonymous Culture. These include the Condorcet and Borda paradoxes, as well as the Condorcet efficiency of plurality voting with runoff. The computations are done by Normaliz. It finds precise probabilities as volumes of polytopes and counting functions encoded as Ehrhart series of polytopes. PubDate: 2019-09-01
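The Condorcet paradox the paper quantifies can be checked directly on a small preference profile by brute force. This sketch only tests a single hypothetical three-candidate profile; it is not the Normaliz/Ehrhart volume computation, which yields exact paradox probabilities over all profiles:

```python
def condorcet_winner(profile, candidates):
    """Return the candidate that beats every other candidate in pairwise
    majority contests, or None if no such candidate exists (a Condorcet
    cycle, i.e. the Condorcet paradox). Each ranking lists candidates
    from most to least preferred."""
    def beats(a, b):
        wins = sum(1 for r in profile if r.index(a) < r.index(b))
        return wins > len(profile) / 2
    for c in candidates:
        if all(beats(c, d) for d in candidates if d != c):
            return c
    return None

# classic cyclic profile: A beats B, B beats C, C beats A pairwise
cycle = [("A", "B", "C"), ("B", "C", "A"), ("C", "A", "B")]
w = condorcet_winner(cycle, "ABC")            # None: Condorcet paradox
```

Counting how many of the (anonymized) profiles produce `None` is exactly the lattice-point counting problem that Ehrhart series solve in closed form.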

Abstract: In this paper we present two mixed-integer programming formulations for the curriculum-based course timetabling problem (CTT). We show that the formulations contain underlying network structures by dividing the CTT into two separate models and then connecting the two models using flow formulation techniques. The first mixed-integer programming formulation is based on an underlying minimum cost flow problem, which decreases the number of integer variables significantly and improves the performance compared to an intuitive mixed-integer programming formulation. The second formulation is based on a multi-commodity flow problem, which is NP-hard in general; however, we prove that it suffices to solve the linear programming relaxation of the model. The formulations are competitive with other approaches based on mixed-integer programming from the literature and improve the currently best known lower bound on one data instance in the benchmark data set from the second international timetabling competition. Regarding upper bounds, the formulation based on the minimum cost flow problem performs better on average than other mixed-integer programming approaches for the CTT. PubDate: 2019-09-01

Abstract: The demand for perishable goods (e.g., baked goods, fruits, vegetables, meat, milk, and seafood) is influenced by product freshness, which gradually declines over time and can be perceived through the expiration date. Selling price is also an important factor affecting demand. Furthermore, most modern companies offer their products on various credit terms to increase sales. However, relatively little attention has been paid to the combined effects of selling price, expiration date and credit term on demand. In this paper, a three-echelon supplier-retailer-consumer supply chain for perishable goods is explored in which the retailer receives an upstream full trade credit from the supplier while granting a downstream partial trade credit to credit-risk customers, with demand as a multiplicative form of selling price, expiration date, and credit period. The proposed model includes numerous previous models as special cases. The optimal credit term, order size and selling price are derived simultaneously for the retailer to achieve maximum profit. Several numerical examples are conducted to gain managerial insights. For example, if the credit efficiency of demand increases, then the retailer should offer a longer downstream credit period to raise sales volume, which in turn implies the retailer can charge a higher price, order a larger quantity, and gain a higher total profit. Conversely, an increase in the portion of cash payment results in a lower demand rate. Hence, the retailer orders less quantity and earns less profit while decreasing price to stimulate sales. Finally, conclusions and future research directions are provided. PubDate: 2019-09-01
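The interplay between price and lot size can be illustrated with a stripped-down joint pricing and EOQ search under price-dependent demand. All parameters below are hypothetical, the demand form is a simple iso-elastic one rather than the paper's multiplicative price/expiration/credit form, and the trade-credit and freshness terms are omitted:

```python
import math

def profit(p, a, e, c, K, h):
    """Annual profit with iso-elastic demand D(p) = a * p**(-e),
    unit cost c, EOQ lot sizing with ordering cost K and holding cost h."""
    D = a * p ** (-e)
    Q = math.sqrt(2 * K * D / h)            # EOQ for the induced demand
    return (p - c) * D - K * D / Q - h * Q / 2

# grid search over candidate prices 2.0 .. 10.0 (step 0.1)
grid = [i / 10 for i in range(20, 101)]
best_p = max(grid, key=lambda p: profit(p, a=1000.0, e=1.5, c=1.0, K=50.0, h=2.0))
```

In the paper the credit period enters the demand function as a third multiplicative factor and is optimized jointly with price and order size.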

Abstract: This article addresses linear sharing rules on transferable utility games (TU-games) with various structures, namely communication structures and conference structures as defined by Myerson in two papers (Myerson in Mathematics of Operations Research 2:225–229, 1977; Myerson in International Journal of Game Theory 9:169–182, 1980). Here, using matrix expressions, we rewrite those sharing rules. With this presentation we identify the close relationship between the fairness property and an equal treatment of necessary players axiom. Moreover, we show that the latter is implied by the equal treatment of equals, linking the fairness property to the notion of equality. PubDate: 2019-09-01
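The sharing rules discussed here live on TU-games, for which the Shapley value is the canonical linear rule; the equal-treatment axioms in the abstract are statements about such rules. A small exact sketch by permutation averaging, on a hypothetical glove game (not one of Myerson's structured games, which additionally restrict cooperation by a graph or conference structure):

```python
from fractions import Fraction
from itertools import permutations

def shapley(players, v):
    """Shapley value by averaging each player's marginal contribution
    over all orders of arrival (exact, feasible for small player sets)."""
    phi = {p: Fraction(0) for p in players}
    orders = list(permutations(players))
    for order in orders:
        coalition = frozenset()
        for p in order:
            phi[p] += Fraction(v(coalition | {p}) - v(coalition))
            coalition = coalition | {p}
    return {p: phi[p] / len(orders) for p in players}

# hypothetical glove game: player 1 holds a left glove, players 2 and 3
# each hold a right glove; a pair is worth 1
def v(S):
    return 1 if 1 in S and len(S) >= 2 else 0

phi = shapley((1, 2, 3), v)   # {1: 2/3, 2: 1/6, 3: 1/6}
```

Players 2 and 3 are equals here and receive equal payoffs, the kind of equal-treatment property the article links to fairness.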

Abstract: We develop a game-theoretic model to guide the choice of the reward structure in customer loyalty programs. We model a duopoly market in which one firm adopts a loyalty program. Firms independently and simultaneously set the prices and rewards. Heterogeneous customers buy homogeneous products in a multi-period setting. Customers are segmented into three groups based on their level of strategic behavior, which is expressed in terms of their degree of forward-lookingness. We use two exogenous parameters to represent the size of each segment. A third parameter captures the point pressure effect, which refers to the increase in customer spending as customers approach a reward threshold. In each period, customers choose the firm that maximizes their utility, which is a function of offered prices, rewards, and the distance to the next reward. We use the logit model to model the customer choice behavior. Customers’ accumulated purchases evolve as a Markov chain. We derive the limiting distribution of accumulated purchases, which is subsequently used to formulate the firm’s expected revenue functions. We develop two algorithms to find the Nash equilibrium for both the linear and nonlinear rewards in terms of the three parameters. Using a thorough numerical analysis, we show that the choice of the structure becomes more critical as the size of the strategic segment increases. The nonlinear scheme is superior when the size of the highly-strategic segment is very small. The linear scheme is superior in markets where the size of the highly-strategic segment and the sensitivity to distance are simultaneously not small. PubDate: 2019-09-01
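The limiting distribution of accumulated purchases can be sketched with power iteration on a toy points chain. The transition structure below is hypothetical and far simpler than the paper's model: a customer with a fixed probability q buys from the loyalty firm (earning one point, with the reward resetting the balance at threshold R), and otherwise earns nothing:

```python
def limiting_distribution(R, q, iters=500):
    """Limiting distribution of a points balance toward a reward
    threshold R: with probability q the state advances (state+1 mod R,
    the reward resetting the balance at R), else it is unchanged.
    Computed by power iteration on the row-stochastic transition kernel."""
    pi = [1.0] + [0.0] * (R - 1)
    for _ in range(iters):
        nxt = [0.0] * R
        for s, mass in enumerate(pi):
            nxt[s] += (1 - q) * mass
            nxt[(s + 1) % R] += q * mass
        pi = nxt
    return pi

pi = limiting_distribution(R=5, q=0.3)   # uniform: each balance ~ 1/5
```

In the paper the purchase probability is not constant: the logit choice depends on the distance to the reward (point pressure), which skews this distribution and feeds into the firms' expected revenue functions.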

Abstract: Capital budgeting optimization models, used in a broad number of fields, require certain and uncertain parameters. Oftentimes, elicited subject matter expert (SME) opinion is used as a parameter estimate, which does not always yield perfect information or correspond to a single value. Because of the uncertainty of the elicitation, the unknown true value of a parameter can be modeled as a random variable from a to-be-determined distribution. We estimate a univariate distribution using four different approaches: the Beta and Gaussian distributions, a standard Gaussian kernel estimate, and an exponential epi-spline. We also capture dependencies within the parameters through three multivariate approaches: the multivariate Gaussian distribution, the multivariate kernel estimate and the multivariate exponential epi-spline. This is the first three-dimensional application of the latter. Sampling from the densities, we generate scenarios and implement a superquantile risk-based capital budgeting optimization model. Numerical experiments contrast the differences between estimators, as well as their effects on an optimal solution. Our findings demonstrate that naively averaging the SME observations for use in optimization, rather than incorporating uncertainty, results in an overly optimistic portfolio. The flexibility of the exponential epi-spline estimator to fuse soft information with observed data produces reasonable density functions for univariate and multivariate random variables. Including a decision-maker’s risk-averseness through risk-based optimization delivers conservative results while incorporating the uncertainty of unknown parameters. We demonstrate a 20% improvement for this specific case when using our approach as opposed to the naive method. PubDate: 2019-09-01
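One of the four univariate estimators named, the standard Gaussian kernel estimate, is easy to sketch from raw SME observations. The data and bandwidth below are hypothetical, and the epi-spline machinery that distinguishes the paper's approach is not shown:

```python
import math

def gaussian_kde(data, h):
    """Standard Gaussian kernel density estimate with bandwidth h:
    f(x) = (1 / (n*h*sqrt(2*pi))) * sum_i exp(-0.5*((x - x_i)/h)**2)."""
    n = len(data)
    c = 1.0 / (n * h * math.sqrt(2.0 * math.pi))
    def f(x):
        return c * sum(math.exp(-0.5 * ((x - xi) / h) ** 2) for xi in data)
    return f

sme = [4.1, 4.4, 3.8, 5.0, 4.6]        # hypothetical SME elicitations
f = gaussian_kde(sme, h=0.4)
# Riemann check: the density integrates to ~1 over a wide grid
grid = [i / 100 for i in range(0, 1001)]
mass = sum(f(x) * 0.01 for x in grid)
```

Scenarios for the superquantile-based optimization are then drawn from densities like this one instead of collapsing the elicitations to their naive average.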