Posted by Albert Lee on 10/18/13 9:10 AM

--------------------------------------------------------

Incidence and severity are the two basic components of any loss model. Loss Distributions by Robert V. Hogg and Stuart A. Klugman (Wiley, 1984) provides in-depth coverage of relevant topics. I first encountered this book during a recent visit to Milliman’s Milwaukee office. Its usefulness to our work at Summit was immediately apparent.

The first chapter introduces the basic terminology actuaries use when discussing loss modeling. Chapter 2 discusses several useful probability distributions, both discrete and continuous. These chapters make a good review of basic probability theory. What goes beyond basic review is the treatment of mixture models (Section 2.7), which shows how to combine different distributions (e.g., a log-gamma and a gamma distribution) into a single distribution that mimics observed characteristics, especially skewed loss distributions.
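
The mixture idea can be sketched in a few lines of Python. The weights and parameters below are illustrative assumptions, not values from the book, and I pair a gamma body with a lognormal tail rather than the book's log-gamma; the point is only that a weighted sum of two densities is itself a density, one that can mimic a skewed loss distribution better than either component alone.

```python
import math

def gamma_pdf(x, shape, scale):
    """Gamma density; models the bulk of small-to-moderate losses."""
    if x <= 0:
        return 0.0
    return (x ** (shape - 1) * math.exp(-x / scale)
            / (math.gamma(shape) * scale ** shape))

def lognorm_pdf(x, mu, sigma):
    """Lognormal density; contributes a heavier right tail."""
    if x <= 0:
        return 0.0
    return (math.exp(-(math.log(x) - mu) ** 2 / (2 * sigma ** 2))
            / (x * sigma * math.sqrt(2 * math.pi)))

def mixture_pdf(x, w=0.7, shape=2.0, scale=5.0, mu=3.0, sigma=1.0):
    """Two-component mixture: weight w on the gamma body and
    1 - w on the lognormal tail.  Parameters are hypothetical."""
    return w * gamma_pdf(x, shape, scale) + (1 - w) * lognorm_pdf(x, mu, sigma)
```

Because each component integrates to one, any convex combination of them does as well, so the mixture is a legitimate severity distribution whose tail weight is tuned by `w`.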

Chapter 3 discusses nonparametric analogs and the techniques used to estimate them. It starts off in the familiar territory of point estimates and confidence intervals, then quickly moves into more advanced treatments of maximum likelihood estimation and Bayesian estimators. These discussions lay the foundation for the crux of the text, which begins in the next chapter.

Chapter 4, “Modeling Loss Distributions”, is the heart of this book. It discusses a variety of data pathologies typical of loss data (e.g., truncation and censoring) and adapts estimation techniques to different types of data (e.g., grouped and clustered). Beyond statistical techniques, the chapter provides examples of best practices for presenting loss data in tables and graphics. Section 4.7 reviews the modeling process, breaking loss modeling down into a sequence of steps in the form of a flow chart.
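
To make truncation and censoring concrete, here is a minimal sketch (my own, not the book's notation) of the log-likelihood for an exponential severity model when losses below a deductible are never reported (left truncation) and losses at or above a policy limit are only known to have reached it (right censoring). The deductible, limit, and loss values are hypothetical.

```python
import math

def exp_loglik(theta, losses, deductible, limit):
    """Log-likelihood of exponential severity (mean theta) for losses
    left-truncated at the deductible and right-censored at the limit.

    By the memoryless property, a loss observed above the deductible
    behaves like a fresh exponential in excess of it, so each fully
    observed loss x contributes -log(theta) - (x - d)/theta, and each
    censored loss contributes only the survival term -(limit - d)/theta.
    """
    ll = 0.0
    for x in losses:
        if x >= limit:      # censored: we only know the loss hit the limit
            ll += -(limit - deductible) / theta
        else:               # fully observed loss above the deductible
            ll += -math.log(theta) - (x - deductible) / theta
    return ll
```

Setting the derivative to zero gives the closed-form maximizer: the sum of capped excess losses, `min(x, limit) - deductible`, divided by the number of uncensored observations. A grid search over `theta` recovers the same value.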

Ever wonder how insurance companies calculate deductibles? Chapter 5 shows you the tricks of the trade. This short chapter combines all the aforementioned techniques into a series of applications (estimating percentiles, deductibles, and leveraging).
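As a toy illustration of the deductible arithmetic (my own sketch, not the book's method), two empirical quantities drive this kind of analysis: the insurer's average payment per loss under a deductible, and the loss elimination ratio, the share of ground-up losses the deductible removes.

```python
def expected_payment_per_loss(losses, deductible):
    """Average insurer payment per loss: the insurer pays max(x - d, 0)."""
    return sum(max(x - deductible, 0.0) for x in losses) / len(losses)

def loss_elimination_ratio(losses, deductible):
    """Fraction of total ground-up losses eliminated by the deductible;
    the insured retains min(x, d) of each loss."""
    retained = sum(min(x, deductible) for x in losses)
    return retained / sum(losses)
```

For losses of 10, 20, and 30 with a deductible of 15, the insurer pays 0, 5, and 15 (an average of 20/3), and the deductible eliminates 40 of the 60 in ground-up losses, a loss elimination ratio of 2/3.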

This book is a classic; its usefulness has not diminished with the advent of more advanced estimation techniques. It provides a language that bridges the gap between statisticians and actuaries. For some, it offers a good review of probability theory and its applications. I suspect others will find its ideas immediately useful for loss modeling, including credit subsidy portfolio modeling, loan loss reserve calculation, and insurance fee calculation. For enforcement analytics, the text contains useful ideas for estimating major enforcement cases and their monetary results, to name just a few applications.

Topics: Summit Blog, data analytics

Complexity simplified.

Summit is a specialized analytics advisory firm that guides clients as they decode their most complex analytical challenges. Our blog highlights the strategies and techniques we use, as well as relevant topics in current events.