
Description
It is common in the analysis of data to provide a goodness-of-fit test to assess the performance of a model. In the analysis of contingency tables, goodness-of-fit statistics are frequently employed when modeling social science, educational or psychological data where the interest is often directed at investigating the association among multi-categorical variables. Pearson's chi-squared statistic is well-known in goodness-of-fit testing, but it is sometimes considered to produce an omnibus test as it gives little guidance to the source of poor fit once the null hypothesis is rejected. However, its components can provide powerful directional tests. In this dissertation, orthogonal components are used to develop goodness-of-fit tests for models fit to the counts obtained from the cross-classification of multi-category dependent variables. Ordinal categories are assumed. Orthogonal components defined on marginals are obtained when analyzing multi-dimensional contingency tables through the use of the QR decomposition. A subset of these orthogonal components can be used to construct limited-information tests that allow one to identify the source of lack-of-fit and provide an increase in power compared to Pearson's test. These tests can address the adverse effects presented when data are sparse. The tests rely on the set of first- and second-order marginals jointly, the set of second-order marginals only, and the random forest method, a popular algorithm for modeling large complex data sets. The performance of these tests is compared to the likelihood ratio test as well as to tests based on orthogonal polynomial components. The derived goodness-of-fit tests are evaluated with studies for detecting two- and three-way associations that are not accounted for by a categorical variable factor model with a single latent variable. 
In addition, the tests are used to investigate the case when the model misspecification involves parameter constraints for large and sparse contingency tables. The methodology proposed here is applied to data from the 38th round of the State of the State Survey conducted by the Institute for Public Policy and Social Research at Michigan State University (2005). The results illustrate the use of the proposed techniques in the context of a sparse data set.
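The orthogonal-component tests above are built on top of the ordinary Pearson statistic. As a minimal sketch (a generic illustration in Python, not the QR-based decomposition on marginals developed in the dissertation), the base statistic for a two-way table under the independence model can be computed as:

```python
import numpy as np

def pearson_gof(observed):
    """Pearson chi-squared statistic for a two-way contingency table
    under the independence model, with its degrees of freedom."""
    observed = np.asarray(observed, dtype=float)
    row = observed.sum(axis=1, keepdims=True)    # row marginals
    col = observed.sum(axis=0, keepdims=True)    # column marginals
    expected = row @ col / observed.sum()        # fitted counts under independence
    x2 = ((observed - expected) ** 2 / expected).sum()
    df = (observed.shape[0] - 1) * (observed.shape[1] - 1)
    return x2, df
```

The dissertation's point is that the summands of `x2`, suitably orthogonalized, carry directional information that the omnibus total discards.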
ContributorsMilovanovic, Jelena (Author) / Young, Dennis (Thesis advisor) / Reiser, Mark R. (Thesis advisor) / Wilson, Jeffrey (Committee member) / Eubank, Randall (Committee member) / Yang, Yan (Committee member) / Arizona State University (Publisher)
Created2011
Description
A Guide to Financial Mathematics is a comprehensive and easy-to-use study guide for students preparing for one of the first actuarial exams, Exam FM. While there are many resources available to students studying for these exams, this guide is free to students and takes an approach to the material similar to that presented in class at ASU. The guide is available to students and professors in the new Actuarial Science degree program offered by ASU. There are twelve chapters, including financial calculator tips, detailed notes, examples, and practice exercises. Included at the end of the guide is a list of referenced material.
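As a flavor of the Exam FM material such a guide covers, the present value of an annuity-immediate is a staple formula. A small sketch (function name hypothetical):

```python
def annuity_pv(payment, i, n):
    """Present value of an n-payment annuity-immediate at effective
    rate i per period: payment * a_angle-n = payment * (1 - v**n) / i,
    where v = 1 / (1 + i) is the discount factor."""
    v = 1.0 / (1.0 + i)
    return payment * (1.0 - v ** n) / i
```

For example, ten annual payments of 100 at 5% effective are worth about 772.17 today.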
ContributorsDougher, Caroline Marie (Author) / Milovanovic, Jelena (Thesis director) / Boggess, May (Committee member) / Barrett, The Honors College (Contributor) / Department of Information Systems (Contributor) / School of Mathematical and Statistical Sciences (Contributor)
Created2015-05
Description
The use of generalized linear models in loss reserving is not new; many statistical models have been developed to fit the loss data gathered by various insurance companies. The most popular models belong to what Glen Barnett and Ben Zehnwirth in "Best Estimates for Reserves" call the "extended link ratio family (ELRF)," as they are developed from the chain ladder algorithm used by actuaries to estimate unpaid claims. Although these models are intuitive and easy to implement, they are nevertheless flawed because many of the assumptions behind the models do not hold true when fitted to real-world data. Even more problematically, the ELRF cannot account for environmental changes, such as inflation, that are often observed in practice. Barnett and Zehnwirth conclude that a new set of models containing parameters not only for accident year and development period trends but also for payment year trends would be a more accurate predictor of loss development. This research applies the paper's ideas to data gathered by Company XYZ. The data were fitted with an adapted version of Barnett and Zehnwirth's new model in R, and a trend selection algorithm was developed to accompany the regression code. The final forecasts were compared to Company XYZ's booked reserves to evaluate the predictive power of the model.
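For context, the chain ladder algorithm from which the ELRF models are derived can be sketched in a few lines. This is a generic Python illustration (the thesis itself works in R), assuming a full cumulative triangle with `NaN` marking the unobserved lower-right cells:

```python
import numpy as np

def chain_ladder(triangle):
    """Volume-weighted development factors and ultimate-loss projections
    from a cumulative loss triangle.  Assumes a regular triangle: as many
    accident years as development periods, NaN below the diagonal."""
    tri = np.asarray(triangle, dtype=float)
    n = tri.shape[1]
    # age-to-age factor for each development period
    factors = []
    for j in range(n - 1):
        mask = ~np.isnan(tri[:, j + 1])          # rows observed at period j+1
        factors.append(tri[mask, j + 1].sum() / tri[mask, j].sum())
    # project each accident year from its last observed cell to ultimate
    ultimates = []
    for i, row in enumerate(tri):
        last = n - 1 - i                          # diagonal position for row i
        val = row[last]
        for f in factors[last:]:
            val *= f
        ultimates.append(val)
    return factors, ultimates
```

The ELRF critique in the abstract is precisely that these ratio-based projections encode regression assumptions that real data often violate.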
ContributorsZhang, Zhihan Jennifer (Author) / Milovanovic, Jelena (Thesis director) / Tomita, Melissa (Committee member) / Zicarelli, John (Committee member) / W.P. Carey School of Business (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / Barrett, The Honors College (Contributor)
Created2018-05
Description
Catastrophe events occur rather infrequently, but upon their occurrence, can lead to colossal losses for insurance companies. Due to their size and volatility, catastrophe losses are often treated separately from other insurance losses. In fact, many property and casualty insurance companies feature a department or team which focuses solely on modeling catastrophes. Setting reserves for catastrophe losses is difficult due to their unpredictable and often long-tailed nature. Determining loss development factors (LDFs) to estimate the ultimate loss amounts for catastrophe events is one method for setting reserves. In an attempt to aid Company XYZ in setting more accurate reserves, the research conducted focuses on estimating LDFs for catastrophes which have already occurred and been settled. Furthermore, the research describes the process used to build a linear model in R to estimate LDFs for Company XYZ's closed catastrophe claims from 2001–2016. This linear model was used to predict a catastrophe's LDFs based on the age in weeks of the catastrophe during the first year. Back-testing was also performed, as was a comparison between the estimated ultimate losses and actual losses. Future research considerations were also proposed.
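A minimal stand-in for such a linear model, sketched in Python rather than the R used in the thesis, with hypothetical inputs (`ages_weeks`, `ldfs`):

```python
import numpy as np

def fit_ldf_by_age(ages_weeks, ldfs):
    """Ordinary least squares of LDF on catastrophe age in weeks:
    ldf ~ intercept + slope * age.  Returns (intercept, slope)."""
    X = np.column_stack([np.ones(len(ages_weeks)), ages_weeks])
    beta, *_ = np.linalg.lstsq(X, np.asarray(ldfs, dtype=float), rcond=None)
    return beta

def predict_ldf(beta, age_weeks):
    """Predicted LDF at a given age from the fitted coefficients."""
    return beta[0] + beta[1] * age_weeks
```

In practice one would expect LDFs to decline toward 1.0 as a catastrophe ages, so the fitted slope should be negative.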
ContributorsSwoverland, Robert Bo (Author) / Milovanovic, Jelena (Thesis director) / Zicarelli, John (Committee member) / School of Mathematical and Statistical Sciences (Contributor) / Barrett, The Honors College (Contributor)
Created2018-05
Description
This expository thesis explores the financial health and actuarial analysis of a particular solution for those seeking stability and security in their golden years: the CCRC industry. A continuing care retirement community, or CCRC, is a comprehensive project and campus that offers its residents a full spectrum of care from independent living, to assisted living, to skilled nursing. After reading this paper, any person with no prior knowledge of a continuing care retirement community should gain a firm understanding of the background, risks and benefits, and legislative safeguards of this complex industry. Financially, a CCRC operates in some aspects similar to long-term care (LTC) insurance. However, CCRCs provide multiple levels of care operations while maintaining a pleasant, engaging community environment where seniors can have all their lifestyle needs met. The expensive and complex operations of a CCRC are not without risk: the industry has seen marked periods of bankruptcy followed by increasing and changing regulatory oversight. Thus, CCRCs require a periodic actuarial analysis and report, among an array of other legislative safeguards against bankruptcy. A CCRC's insolvency or inability to meet its obligations can be catastrophic and inflict suffering and damage not only on its residents but also on their friends and families. With seniors historically being one of the most vulnerable demographic groups, it is absolutely essential that an all-encompassing care facility continues to exist and fulfill its contractual promises by maintaining sound actuarial practices and financial health. This thesis, in addition to providing an exposition of the background and functions of the CCRC, describes the existing actuarial and financial studies and audits in practice to ensure sound governance and the quality of life of CCRC residents.
ContributorsTang, Julie (Author) / Milovanovic, Jelena (Thesis director) / Hassett, Matthew J. (Committee member) / School of Mathematical and Statistical Sciences (Contributor) / Barrett, The Honors College (Contributor)
Created2017-05
Description

The Mack model and the Bootstrap Over-Dispersed Poisson model have long been the primary modeling tools used by actuaries and insurers to forecast losses. With the emergence of faster computational technology, novel methods to calculate and simulate data are more applicable than ever before. This paper explores the use of various Bayesian Markov chain Monte Carlo (MCMC) models recommended by Glenn Meyers and compares the results to the simulated data from the Mack model and the Bootstrap Over-Dispersed Poisson model. Although the Mack model and the Bootstrap Over-Dispersed Poisson model are accurate to a certain degree, newer models could be developed that may yield better results. However, a general concern is that no single model is able to reflect underlying information that only an individual with intimate knowledge of the data would know. Thus, the purpose of this paper is not to distinguish one model that works for all applicable data, but to propose various models that have pros and cons and suggest ways that they can be improved upon.
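The engine underneath any Bayesian reserving model of this kind is an MCMC sampler. As a generic illustration (not Meyers's actual models, which place structured priors on loss-development parameters), a one-dimensional random-walk Metropolis sampler can be written as:

```python
import math
import random

def metropolis(log_post, x0, steps, scale, seed=0):
    """Random-walk Metropolis sampler for a scalar parameter.
    log_post: unnormalized log posterior density.
    Proposes x' ~ Normal(x, scale) and accepts with probability
    min(1, post(x') / post(x))."""
    rng = random.Random(seed)
    x, lp = x0, log_post(x0)
    samples = []
    for _ in range(steps):
        prop = x + rng.gauss(0.0, scale)
        lp_prop = log_post(prop)
        if math.log(rng.random()) < lp_prop - lp:   # accept/reject step
            x, lp = prop, lp_prop
        samples.append(x)
    return samples
```

Run on a standard-normal log posterior, the chain's post-burn-in mean and variance should approach 0 and 1; real reserving applications sample many parameters jointly, typically via Stan or similar tools.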

ContributorsZhang, Zhaobo (Author) / Zicarelli, John (Thesis director) / Milovanovic, Jelena (Committee member) / Barrett, The Honors College (Contributor) / School of Mathematical and Statistical Sciences (Contributor)
Created2023-05
Description

Regulation in the insurance market has increased greatly over the past four decades, and recent regulatory frameworks such as Solvency II have made simulations increasingly important. Monte Carlo simulations are often too inefficient to be used on their own, and they begin to struggle as the complexity of insurance contracts increases. For that reason, there have been numerous suggested improvements to traditional MC methods, such as the sample recycling method and a neural network method. This thesis reviews various risk measures and the methods used to calculate them, then analyzes the sample recycling method and the neural network method in detail and gives a comparative analysis of the two. It was found that both methods provide a large improvement in computational cost and overall run time with only minor impacts on accuracy. Thus, it was concluded that the sample recycling method is best suited for contracts where the inner-loop estimations are particularly complex, while the neural network method is a general approach that pairs well with complex input portfolios.
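The baseline that both methods accelerate is plain nested simulation: an outer loop of real-world scenarios, each requiring an inner valuation loop. A toy sketch with a deliberately simplified loss model (a single normal shock; all names hypothetical):

```python
import random
import statistics

def nested_mc_var(outer, inner, alpha=0.99, seed=0):
    """Plain nested Monte Carlo estimate of an alpha-quantile risk measure.
    Outer loop: draw a real-world scenario.  Inner loop: estimate the
    contract loss in that scenario by simulation.  The quantile of the
    inner estimates approximates the VaR-style capital requirement."""
    rng = random.Random(seed)
    losses = []
    for _ in range(outer):
        shock = rng.gauss(0.0, 1.0)              # outer (real-world) scenario
        inner_vals = [max(0.0, shock + rng.gauss(0.0, 1.0))
                      for _ in range(inner)]     # inner (valuation) loop
        losses.append(statistics.fmean(inner_vals))
    losses.sort()
    return losses[int(alpha * outer) - 1]        # empirical alpha-quantile
```

The `outer * inner` cost is exactly what sample recycling (reusing inner samples across nearby outer scenarios) and neural-network proxies (replacing the inner loop with a fitted approximation) aim to cut down.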

ContributorsWesten, Ron (Author) / Zhou, Kenneth (Thesis director) / Milovanovic, Jelena (Committee member) / Barrett, The Honors College (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / Department of Physics (Contributor)
Created2023-05
Description

An examination of various reserving methods and their application in commercial auto insurance. It seeks to answer two questions: which model performs best among the Chain Ladder, Mack Chain Ladder, Munich Chain Ladder, Clark's LDF, and Clark's Cape Cod methods, and which loss basis, paid or incurred, yields better reserves?

ContributorsLindgren, Connor (Author) / Zicarelli, John (Thesis director) / Milovanovic, Jelena (Committee member) / Barrett, The Honors College (Contributor) / School of Mathematical and Statistical Sciences (Contributor)
Created2022-12