Authors: Michel Dacorogna
Pages: 211-232
Abstract: In this paper, we review changes in the insurance industry due to new risk-based regulations such as Solvency II and the Swiss Solvency Test. The move from cash-flow-based corporate management to risk-based management is described and discussed through its consequences for capital management, economic valuation and the internal model. We discuss the limits and difficulties of enterprise risk management and its effect on the organisation of companies and on the role of actuaries in insurance. The risk/return relation is becoming a central element of the company’s management, slowly supplanting the traditional accounting view.
PubDate: 2018-09-01
DOI: 10.1017/S1748499518000040
Issue No: Vol. 12, No. 2 (2018)

Authors: J. Lévy Véhel
Pages: 233-248
Abstract: In this note, we provide a simple example of regulation risk. The idea is that, in certain situations, the very prudential rules (or, rather, some of them) imposed by the regulator in the framework of the Basel II/III Accords or the Solvency II Directive are themselves the source of a systemic risk. The instance of regulation risk that we bring to light in this work can be summarised as follows: wrongly assuming that prices evolve in a continuous fashion when they may in fact display large negative jumps, and trying to minimise Value at Risk (VaR) under a constraint of minimal volume of activity, leads in effect to behaviours that will maximise VaR. Although much stylised, our analysis highlights some pitfalls of model-based regulation.
PubDate: 2018-09-01
DOI: 10.1017/S174849951800009X
Issue No: Vol. 12, No. 2 (2018)
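
The mechanism this abstract describes can be illustrated with a toy simulation (this is not the paper's model; the jump size, jump probability and confidence level below are illustrative assumptions): a 99% VaR estimated under a purely continuous (Gaussian) model badly understates the VaR of the same diffusive dynamics once rare large negative jumps are added.

```python
import random

random.seed(0)

def empirical_var(losses, alpha=0.99):
    # alpha-quantile of the empirical loss distribution (losses positive)
    s = sorted(losses)
    return s[int(alpha * len(s))]

n = 100_000
sigma = 0.01
# Losses under a purely continuous (Gaussian) model of returns
cont = [-random.gauss(0.0, sigma) for _ in range(n)]
# Same diffusive part, plus a rare large negative price jump
jump = [-(random.gauss(0.0, sigma) + (-0.10 if random.random() < 0.02 else 0.0))
        for _ in range(n)]

var_cont = empirical_var(cont)
var_jump = empirical_var(jump)
print(var_cont, var_jump)
```

A VaR-minimising agent calibrated on the continuous model would treat `var_cont` as the relevant risk number, while the realised tail is several times larger.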

Authors: Yasutaka Shimizu; Shuji Tanaka
Pages: 249-268
Abstract: This article considers a dynamic version of risk measures for stochastic asset processes and gives a mathematical benchmark for required capital in a solvency regulation framework. Some dynamic risk measures, based on the expected discounted penalty function introduced by Gerber and Shiu, are proposed to measure solvency risk from the company’s going-concern point of view. This study proposes a novel mathematical justification of a risk measure for stochastic processes as a map on a functional path space of future loss processes.
PubDate: 2018-09-01
DOI: 10.1017/S1748499518000064
Issue No: Vol. 12, No. 2 (2018)

Authors: Eric C. K. Cheung; Suhang Dai; Weihong Ni
Pages: 269-295
Abstract: We analyse ruin probabilities for an insurance risk process with a more general dependence structure than the one introduced in Constantinescu et al. (2016). In this paper, we assume that a random threshold window is generated each time a claim occurs. By comparing the previous inter-claim time with the threshold window, the distributions of the current threshold window and the inter-arrival time are determined. Furthermore, the statuses of the previous and current inter-arrival times also determine the current claim size distribution. As in Constantinescu et al. (2016), we first identify the embedded Markov additive process, where all the randomness takes a general form. Inspired by the Erlangisation technique, the key contribution of this paper is to analyse such a risk process using a Markov fluid flow model in which the underlying random variables follow phase-type distributions. This further allows us to approximate the fixed observation windows by Erlang random variables. Ruin probabilities under the process with Erlang(n) observation windows are then proved to be Erlangian approximations that converge, in the limit, to those of the process with fixed threshold windows. An exact form of the limit can be obtained, and its application is illustrated with a numerical example.
PubDate: 2018-09-01
DOI: 10.1017/S1748499517000215
Issue No: Vol. 12, No. 2 (2018)
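
The Erlangisation idea used here (approximating a fixed window T by an Erlang(n) random variable with mean T) can be checked numerically. This is a generic sketch of the approximation itself, not the paper's fluid-flow computation: the Erlang(n, n/T) distribution concentrates around T as n grows.

```python
import random
import statistics

random.seed(1)

def erlang_sample(n, rate):
    # Erlang(n, rate): sum of n independent Exp(rate) variables
    return sum(random.expovariate(rate) for _ in range(n))

T = 1.0  # the fixed threshold window being approximated
means, stds = [], []
for n in (1, 10, 100):
    draws = [erlang_sample(n, n / T) for _ in range(5000)]
    means.append(statistics.mean(draws))
    stds.append(statistics.stdev(draws))
    print(n, means[-1], stds[-1])
```

The mean stays at T while the standard deviation shrinks like T/sqrt(n), which is why quantities computed under Erlang(n) windows converge to those under fixed windows.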

Authors: Ran Xu; Jae-Kyung Woo; Xixuan Han; Hailiang Yang
Pages: 296-325
Abstract: In this work, we propose a capital injection strategy that is implemented periodically, based on the number of claims, in the classical Poisson risk model. Specifically, capital injection decisions are made at a predetermined accumulated number of claim instants, if the surplus is lower than a minimum required level. A similar problem appears in reliability theory, where preventive maintenance policies are performed after certain numbers of shocks. Assuming a combination of exponentials for the claim severities, we first derive an explicit expression for the discounted density of the surplus level after a certain number of claims, given that ruin has not yet occurred. Utilising this result, we study the expected total discounted capital injection until the first ruin time. To solve the differential equation associated with this quantity, we analyse an extended version of Lundberg’s fundamental equation. Similarly, an expression for the Laplace transform of the time to ruin is also found explicitly. Finally, we illustrate the applicability of the present capital injection strategy and methodologies through various numerical examples. In particular, for exponential claim severities, an optimal capital injection strategy that minimises the expected capital spending per unit time is studied numerically.
PubDate: 2018-09-01
DOI: 10.1017/S1748499518000180
Issue No: Vol. 12, No. 2 (2018)
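
The injection rule itself is easy to simulate. The sketch below uses the classical Poisson model with exponential claims and reviews the surplus at every k-th claim instant, topping it up when it sits below the minimum required level; all parameter values are illustrative assumptions, and the paper's object of study (the expected *discounted* injection until ruin) is simplified here to the average total injection over a finite horizon.

```python
import random

random.seed(2)

def simulate(u0=5.0, c=1.2, lam=1.0, mean_claim=1.0,
             k=5, min_level=2.0, horizon=100.0):
    # One path of the classical Poisson risk model with capital
    # injections reviewed at every k-th claim instant
    t, surplus, n_claims, injected = 0.0, u0, 0, 0.0
    while True:
        w = random.expovariate(lam)  # next inter-claim time
        if t + w > horizon:
            break
        t += w
        surplus += c * w - random.expovariate(1.0 / mean_claim)
        n_claims += 1
        if n_claims % k == 0 and surplus < min_level:
            injected += min_level - surplus  # top up to the required level
            surplus = min_level
    return injected

paths = [simulate() for _ in range(2000)]
avg_injection = sum(paths) / len(paths)
print(avg_injection)
```

Varying `k` and `min_level` in such a simulation is the crude Monte Carlo analogue of the optimisation the paper carries out analytically.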

Authors: Huanqun Jiang
Pages: 326-337
Abstract: In this paper, we extend the optimality of the barrier strategy for the dividend payment problem to the setting in which the underlying surplus process is a spectrally negative Lévy process and the discounting factor is an exponential Lévy process. The proof of the main result uses the fluctuation identities of spectrally negative Lévy processes. This extends recent results of Eisenberg for the case where the accumulated interest rate and surplus process are independent Brownian motions with drift.
PubDate: 2018-09-01
DOI: 10.1017/S1748499518000052
Issue No: Vol. 12, No. 2 (2018)
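
A barrier strategy can be sketched by simulation in the simplest special case the abstract mentions, a Brownian surplus with drift and a constant discount rate (all parameter values below are illustrative assumptions, and the Euler scheme is a crude stand-in for the Lévy fluctuation theory used in the paper):

```python
import math
import random

random.seed(7)

def discounted_dividends(barrier, mu=0.5, sigma=1.0, r=0.05,
                         x0=1.0, dt=0.02, horizon=50.0):
    # Euler scheme: surplus follows a Brownian motion with drift; any
    # excess above the barrier is paid out immediately as a dividend,
    # discounted at rate r; the path stops at ruin or at the horizon
    x, t, total = x0, 0.0, 0.0
    while t < horizon:
        x += mu * dt + sigma * math.sqrt(dt) * random.gauss(0.0, 1.0)
        if x > barrier:
            total += math.exp(-r * t) * (x - barrier)
            x = barrier
        if x <= 0.0:
            break
        t += dt
    return total

means = []
for b in (0.5, 1.5, 3.0):
    vals = [discounted_dividends(b) for _ in range(300)]
    means.append(sum(vals) / len(vals))
print(means)
```

Comparing the estimated value across barrier levels is the numerical counterpart of the optimality question the paper settles analytically.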

Authors: Chunhao Cai; Junyi Guo; Honglong You
Pages: 338-349
Abstract: In this paper, we propose an estimator of the survival probability for a Lévy risk model observed at low frequency. The estimator is constructed via a regularised version of the inverse Laplace transform. The convergence rate of the estimator, in the sense of integrated squared error, is studied for large sample sizes. Simulation studies are also provided to show the finite-sample performance of our estimator.
PubDate: 2018-09-01
DOI: 10.1017/S1748499517000100
Issue No: Vol. 12, No. 2 (2018)

Authors: François Dufresne; Enkelejd Hashorva; Gildas Ratovomirija; Youssouf Toukourou
Pages: 350-371
Abstract: Insurance and annuity products covering several lives require modelling the joint distribution of future lifetimes. In the interest of simplifying calculations, it is common in practice to assume that the future lifetimes within a group of people are independent. However, extensive research over the past decades suggests otherwise. In this paper, a copula approach is used to model the dependence between the lifetimes within a married couple, using data from a large Canadian insurance company. As a novelty, the age difference and the gender of the elder partner are introduced as arguments of the dependence parameter. Maximum likelihood techniques are then implemented for the parameter estimation. The results make clear not only that the correlation decreases with age difference, but also that the dependence between the lifetimes is higher when the husband is older than the wife. A goodness-of-fit procedure is applied in order to assess the validity of the model. Finally, considering several annuity products available on the life insurance market, the paper concludes with practical illustrations.
PubDate: 2018-09-01
DOI: 10.1017/S1748499518000076
Issue No: Vol. 12, No. 2 (2018)
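
The qualitative finding (dependence decaying with age difference) can be mimicked in a small sketch. The Clayton copula and the decay function `theta_of_age_gap` below are illustrative stand-ins, not the paper's fitted model or data: the dependence parameter is made a function of the age gap, and sample Kendall's tau falls as the gap widens.

```python
import random

random.seed(3)

def clayton_pair(theta):
    # Conditional-inversion sampler for the Clayton copula
    u = random.random()
    w = random.random()
    v = (u ** (-theta) * (w ** (-theta / (1.0 + theta)) - 1.0) + 1.0) ** (-1.0 / theta)
    return u, v

def kendall_tau(pairs):
    # Plain O(n^2) sample version of Kendall's tau
    n = len(pairs)
    conc = 0
    for i in range(n):
        for j in range(i + 1, n):
            s = (pairs[i][0] - pairs[j][0]) * (pairs[i][1] - pairs[j][1])
            conc += 1 if s > 0 else -1
    return 2.0 * conc / (n * (n - 1))

def theta_of_age_gap(gap):
    # Hypothetical decay of dependence with age difference (illustrative)
    return 2.0 / (1.0 + 0.3 * gap)

taus = []
for gap in (0, 5, 15):
    pairs = [clayton_pair(theta_of_age_gap(gap)) for _ in range(800)]
    taus.append(kendall_tau(pairs))
print(taus)
```

In the paper the functional form of the dependence parameter is estimated by maximum likelihood rather than assumed.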

Authors: Ryan Timmer; John Paul Broussard; G. Geoffrey Booth
Pages: 372-390
Abstract: We study the asset allocation decision of a life insurance company’s general account with respect to the possibility of large negative economic shocks and examine how this account is affected by policyholder investment decisions in the company’s separate account. This is accomplished using a performance metric that incorporates downside risk, measured using univariate and multivariate extreme value distributions. Because of its well-known price volatility, diversification attributes, and significant weight in the combined general and separate accounts, our primary focus is the company’s equity investments. Although industry asset allocations have varied over the past two decades, we find that the actual allocations to equity in the general account are close to the allocation percentages suggested by our extreme value metrics, and both are far below the maximum values indicated by the relevant regulatory bodies.
PubDate: 2018-09-01
DOI: 10.1017/S1748499517000264
Issue No: Vol. 12, No. 2 (2018)
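
Downside-risk work with extreme value distributions typically starts from an estimate of the tail index. A standard Hill estimator on synthetic Pareto losses (illustrative only, not the paper's data or its performance metric) looks like this:

```python
import math
import random

random.seed(8)

alpha_true = 3.0  # tail index of the synthetic Pareto losses
n = 10_000
data = sorted((random.random() ** (-1.0 / alpha_true) for _ in range(n)),
              reverse=True)

k = 500  # number of upper order statistics used
hill_alpha = k / sum(math.log(data[i] / data[k]) for i in range(k))
print(hill_alpha)
```

The choice of k trades bias against variance and is the delicate step in any extreme-value-based risk metric.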

Authors: Maissa Tamraz
Pages: 391-411
Abstract: In this contribution, we consider the classical collective model over a fixed time period for two insurance portfolios and are interested in models for the joint distribution F of the largest claim amounts observed in the two portfolios. Specifically, we consider the tractable model in which the claim counting random variable N follows a discrete-stable distribution with parameters (α,λ). We investigate the dependence properties of F with respect to both parameters α and λ. Furthermore, we present several applications of the new model to concrete insurance data sets and assess its fit relative to other models considered in recent contributions. Our model performs well on most data sets.
PubDate: 2018-09-01
DOI: 10.1017/S174849951800012X
Issue No: Vol. 12, No. 2 (2018)
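
In the boundary case α = 1, the discrete-stable count (pgf exp(-λ(1-s)^α)) reduces to a Poisson(λ) count, and the largest claim M then has the closed form P(M ≤ x) = exp(-λ(1-F(x))). A quick single-portfolio check with unit-mean exponential claims (an illustrative choice of severity, not the paper's data):

```python
import math
import random

random.seed(9)

lam = 3.0  # discrete-stable parameter λ; for α = 1 the count is Poisson(λ)

def largest_claim():
    # Poisson(λ) count via a unit-rate Poisson process on [0, λ];
    # claims are unit-mean exponentials; returns 0 if there is no claim
    n, acc = 0, random.expovariate(1.0)
    while acc < lam:
        n += 1
        acc += random.expovariate(1.0)
    return max((random.expovariate(1.0) for _ in range(n)), default=0.0)

n_sims = 50_000
x = 2.0
sim_tail = sum(largest_claim() > x for _ in range(n_sims)) / n_sims
exact_tail = 1.0 - math.exp(-lam * math.exp(-x))  # 1 - exp(-λ(1 - F(x)))
print(sim_tail, exact_tail)
```

For general α the count's pgf changes but the same mixing argument drives the joint model of the two portfolios' maxima.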

Authors: Leonardo Rojas-Nandayapa; Wangyue Xie
Pages: 412-432
Abstract: We consider phase-type scale mixture distributions, which correspond to distributions of a product of two independent random variables: a phase-type random variable Y and a non-negative but otherwise arbitrary random variable S called the scaling random variable. We investigate conditions for such a class of distributions to be either light- or heavy-tailed, we explore subexponentiality and we determine their maximum domains of attraction. Particular focus is given to phase-type scale mixture distributions where the scaling random variable S has discrete support – such a class of distributions has recently been used in risk applications to approximate heavy-tailed distributions. Our results are complemented with several examples.
PubDate: 2018-09-01
DOI: 10.1017/S1748499517000136
Issue No: Vol. 12, No. 2 (2018)
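
The product construction is easy to sample. In the sketch below, Y is Exp(1) (the simplest phase-type distribution) and S is an illustrative discrete scaling S = 2^K with geometric K, chosen here so that E[S] is infinite; the resulting product tail is far heavier than the exponential tail, which is the effect the abstract's discrete-support case exploits.

```python
import math
import random

random.seed(4)

def scale_mixture_sample():
    # Y: Exp(1), the simplest phase-type distribution
    y = random.expovariate(1.0)
    # S: illustrative discrete scaling, S = 2^K with geometric K
    k = 0
    while random.random() < 0.5:
        k += 1
    return y * (2 ** k)

n = 200_000
samples = [scale_mixture_sample() for _ in range(n)]
t = 20.0
tail_mixture = sum(x > t for x in samples) / n
tail_exp = math.exp(-t)  # P(Exp(1) > t) without scaling
print(tail_mixture, tail_exp)
```

Whether the mixture ends up light- or heavy-tailed is governed by the tail of S, which is exactly the dichotomy the paper characterises.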

Authors: Michel Dacorogna; Laila Elbahtouri; Marie Kratz
Pages: 433-454
Abstract: Validation of risk models is required by regulators and demanded by management and shareholders. In practice, those models rely heavily on Monte Carlo (MC) simulations. Given their complexity, the convergence of the MC algorithm is difficult to prove mathematically. To circumvent this problem and nevertheless explore the conditions of convergence, we suggest an analytical approach. Considering standard models, we compute, via mixing techniques, closed-form formulas for risk measures such as Value-at-Risk (VaR) or Tail Value-at-Risk (TVaR) on a portfolio of risks, and consequently for the associated diversification benefit. The numerical convergence of MC simulations of these various quantities is then tested against their analytical evaluations. The speed of convergence appears to depend on the fatness of the tail of the marginal distributions: the higher the tail index, the faster the convergence. We also explore the behaviour of the diversification benefit with various dependence structures and marginals (heavy and light tails). As expected, it varies strongly with the type of dependence between aggregated risks. The diversification benefit is also studied as a function of the risk measure, VaR or TVaR.
PubDate: 2018-09-01
DOI: 10.1017/S1748499517000227
Issue No: Vol. 12, No. 2 (2018)
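
A self-contained analogue of this convergence test uses a single Pareto marginal rather than the paper's mixed portfolio models (the tail index, confidence level and sample sizes below are illustrative): the quantile is known in closed form, so the MC error can be measured directly as the sample size grows.

```python
import random

random.seed(5)

alpha_tail = 2.5   # Pareto tail index (illustrative)
p = 0.99           # VaR confidence level

def pareto_sample():
    # Inverse transform for survival S(x) = x^(-alpha), x >= 1
    return random.random() ** (-1.0 / alpha_tail)

var_exact = (1.0 - p) ** (-1.0 / alpha_tail)  # closed-form quantile

mean_errs = []
for n in (500, 5_000, 50_000):
    errs = []
    for _ in range(20):
        sample = sorted(pareto_sample() for _ in range(n))
        errs.append(abs(sample[int(p * n)] - var_exact))
    mean_errs.append(sum(errs) / len(errs))
print(var_exact, mean_errs)
```

Repeating this with a lower tail index (a fatter tail) slows the decay of the error, which is the qualitative dependence the abstract reports.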

Authors: Søren Asmussen
Pages: 455-478
Abstract: Conditional Monte Carlo replaces a naive estimate Z of a number z by its conditional expectation given a suitable piece of information. It always reduces variance, and its traditional applications are in that vein. We survey here other potential uses, such as density estimation and calculations for Value-at-Risk and/or expected shortfall, going in part into the implementation in various copula structures. The interplay between these different aspects also comes into play.
PubDate: 2018-09-01
DOI: 10.1017/S1748499517000252
Issue No: Vol. 12, No. 2 (2018)
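
The basic variance-reduction step can be shown in a textbook example (an illustrative choice, not one of the survey's copula applications): to estimate z = P(X + Y > t) with X, Y independent Exp(1), replace the indicator Z = 1{X + Y > t} by its conditional expectation given X, namely P(Y > t - X | X), which is known in closed form.

```python
import math
import random
import statistics

random.seed(6)

t = 5.0
n = 50_000
xs = [random.expovariate(1.0) for _ in range(n)]
ys = [random.expovariate(1.0) for _ in range(n)]

# Naive estimator: indicator of the event {X + Y > t}
naive = [1.0 if x + y > t else 0.0 for x, y in zip(xs, ys)]

# Conditional Monte Carlo: replace the indicator by P(Y > t - X | X)
cond = [math.exp(-(t - x)) if x < t else 1.0 for x in xs]

z_true = (1.0 + t) * math.exp(-t)  # Erlang(2,1) survival at t
print(statistics.mean(naive), statistics.mean(cond), z_true)
print(statistics.variance(naive), statistics.variance(cond))
```

Both estimators are unbiased, but the conditional one has strictly smaller variance, and as a smooth function of X it also yields a density estimate for X + Y, the kind of secondary use the survey develops.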