Matching Items (8)
Description
Catastrophe events occur rather infrequently, but when they do occur, they can lead to colossal losses for insurance companies. Due to their size and volatility, catastrophe losses are often treated separately from other insurance losses; in fact, many property and casualty insurance companies have a department or team that focuses solely on modeling catastrophes. Setting reserves for catastrophe losses is difficult because of their unpredictable and often long-tailed nature. Determining loss development factors (LDFs) to estimate ultimate loss amounts for catastrophe events is one method for setting reserves. To help Company XYZ set more accurate reserves, this research focuses on estimating LDFs for catastrophes which have already occurred and been settled. Furthermore, the research describes the process used to build a linear model in R to estimate LDFs for Company XYZ's closed catastrophe claims from 2001–2016. This linear model was used to predict a catastrophe's LDFs based on the age in weeks of the catastrophe during its first year. Back testing was also performed, and the estimated ultimate losses were compared with actual losses. Directions for future research are also proposed.
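As an illustration of the approach described in this abstract, a linear model mapping a catastrophe's age in weeks to an LDF might look like the following. This is a minimal sketch in Python rather than the R used in the thesis, and the training data are entirely hypothetical:

```python
import numpy as np

# Hypothetical training data: catastrophe age in weeks during the first
# year, and the loss development factor (LDF) observed at that age.
age_weeks = np.array([4.0, 8.0, 13.0, 26.0, 39.0, 52.0])
ldf = np.array([3.10, 2.40, 1.90, 1.45, 1.20, 1.05])

# Ordinary least squares: ldf ~ b0 + b1 * age_weeks
X = np.column_stack([np.ones_like(age_weeks), age_weeks])
b0, b1 = np.linalg.lstsq(X, ldf, rcond=None)[0]

def predict_ldf(age):
    """Predict the LDF for a catastrophe at a given age in weeks."""
    return b0 + b1 * age

# Estimated ultimate loss = reported loss to date * predicted LDF.
reported_loss = 10_000_000            # hypothetical loss reported at week 20
estimated_ultimate = reported_loss * predict_ldf(20)
```

Multiplying the loss reported to date by the predicted LDF yields the estimated ultimate loss, which is the quantity reserves are set against.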
ContributorsSwoverland, Robert Bo (Author) / Milovanovic, Jelena (Thesis director) / Zicarelli, John (Committee member) / School of Mathematical and Statistical Sciences (Contributor) / Barrett, The Honors College (Contributor)
Created2018-05
Description
A distributed sensor network (DSN) is a set of spatially scattered intelligent sensors designed to obtain data across an environment, and DSNs are becoming a standard architecture for collecting data over a large area. To properly exploit having multiple sensors, the nodal data must be registered across the network. One major problem worth investigating is ensuring the integrity of the received data, for example through time synchronization. Consider a group of matched-filter sensors, each collecting the same data and comparing the data collected to a known signal. In an ideal world, each sensor would collect the data without offsets or noise in the system. Two models follow from this. In the first, each sensor makes a decision on its own, and the decisions are collected at a "fusion center," which decides whether the signal is present based on the number of true-or-false decisions it receives. Alternatively, each sensor relays the data it collects to the fusion center, which then makes a decision based on all of the data it receives. Since the fusion center has more information to base its decision on in the latter case, as opposed to receiving only a true or false from each sensor, one would expect the latter model to perform better; in fact, this is the gold standard for detection across a DSN. However, random noise corrupts data collection, especially among sensors in a DSN: each sensor does not collect the data in exactly the same way or with the same precision. We classify these imperfections in data collection as offsets, specifically the offset present in the data collected by one sensor with respect to the rest of the sensors in the network. Therefore, reconsider the two models for a DSN described above.
We can naively implement either of these models for data collection, or we can attempt to estimate the offsets between the sensors and compensate for them; one would expect that estimating the offsets within the DSN provides better overall results than not estimating them. This thesis is structured as follows. First, there is an extensive investigation into detection theory and the impact that different types of offsets have on sensor networks. Following the theory, an algorithm for estimating the data offsets is proposed to correct for the offsets. Next, Monte Carlo simulation results show the impact of data offsets on sensor performance in comparison to a sensor network without offsets present. The algorithm is then implemented, and further experiments demonstrate sensor performance with offset detection.
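The two fusion models compared in this abstract can be illustrated with a small Monte Carlo sketch. This is a hypothetical Python simulation under idealized assumptions (unit-variance Gaussian noise, no offsets, equal-probability signal presence), not the thesis's actual experiment:

```python
import numpy as np

rng = np.random.default_rng(0)

K, N, TRIALS = 5, 16, 4000           # sensors, samples per sensor, trials
s = np.full(N, 0.25)                 # known signal template, energy ||s||^2 = 1
E = s @ s

hard_correct = soft_correct = 0
for _ in range(TRIALS):
    present = rng.random() < 0.5
    # Each sensor observes the signal (if present) in unit-variance noise.
    x = rng.standard_normal((K, N)) + (s if present else 0.0)
    stats = x @ s                    # per-sensor matched-filter statistics

    # Model 1: each sensor decides locally; fusion center majority-votes.
    votes = stats > E / 2
    hard_decision = votes.sum() > K / 2

    # Model 2: fusion center pools the raw statistics before deciding.
    soft_decision = stats.sum() > K * E / 2

    hard_correct += hard_decision == present
    soft_correct += soft_decision == present

hard_acc = hard_correct / TRIALS
soft_acc = soft_correct / TRIALS
```

Pooling the raw statistics should come out ahead of majority voting, since no information is discarded before the fusion center decides; this is the sense in which centralized data fusion is the gold standard.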
ContributorsMonardo, Vincent James (Author) / Cochran, Douglas (Thesis director) / Kierstead, Hal (Committee member) / Electrical Engineering Program (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / Barrett, The Honors College (Contributor)
Created2016-05
Description
This thesis details the design process of a variable gain amplifier (VGA) based circuit which maintains a consistent output power over a wide range of input power levels. This effect is achieved by using power detection circuitry to adjust the gain of the VGA based on the current input power so that the signal is amplified to a set power level. The paper details the theory behind this solution as well as the design process, which includes both simulations and physical testing of the actual circuit. It also analyzes the results of these tests and gives suggestions on how the design could be further improved. The VGA-based constant-output-power solution was designed as a section of a larger circuit developed as part of a senior capstone project, which is also briefly described in the paper.
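The control principle described here, detecting the output power and feeding the error back into the VGA gain, can be sketched numerically. The following is a hypothetical Python model of an idealized dB-linear VGA with detector feedback, not the actual circuit from the thesis:

```python
def agc_output_powers(input_powers_dbm, target_dbm=0.0, mu=0.5, steps=20):
    """Idealized dB-linear VGA with power-detector feedback: the detector
    measures output power and the gain is nudged toward the target."""
    gain_db = 0.0
    outputs = []
    for p_in in input_powers_dbm:
        for _ in range(steps):                    # let the control loop settle
            p_out = p_in + gain_db                # ideal VGA: out = in + gain (dB)
            gain_db += mu * (target_dbm - p_out)  # feedback from the detector
        outputs.append(p_in + gain_db)
    return outputs

# Output power holds at ~0 dBm across a 35 dB range of input power.
outs = agc_output_powers([-30.0, -10.0, 5.0])
```

Each feedback step halves the remaining error (with mu = 0.5), so after the loop settles the output power is essentially pinned to the setpoint regardless of input level.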

ContributorsMeyer, Sheldon (Author) / Aberle, James (Thesis director) / Chakraborty, Partha (Committee member) / Electrical Engineering Program (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / Barrett, The Honors College (Contributor)
Created2021-05
Description
AARP estimates that 90% of seniors wish to remain in their homes during retirement. Seniors need assistance as they age; historically, they have received it from family members, nursing homes, or Continuing Care Retirement Communities. For seniors not wanting any of these options, there have been very few alternatives. Now, the emergence of the continuing care at home (CCaH) program is providing hope for a different method of elder care moving forward. CCaH programs offer services such as skilled nursing care, care coordination, emergency response systems, aid with personal and health care, and transportation. Such services allow seniors to continue living in their own homes with assistance as their health deteriorates over time. Currently, only 30 CCaH programs exist. With the growth of the elderly population in the coming years, this model seems poised for growth.
ContributorsSturm, Brendan (Author) / Milovanovic, Jelena (Thesis director) / Hassett, Matthew (Committee member) / School of Mathematical and Statistical Sciences (Contributor) / Economics Program in CLAS (Contributor) / Barrett, The Honors College (Contributor)
Created2019-05
Description
The objective of this paper is to find and describe trends in fast-Fourier-transformed accelerometer data that can be used to predict the mechanical failure of large vacuum pumps used in industrial settings, such as the provision of drinking water. Using three-dimensional plots of the data, this paper suggests how a model can be developed to predict the mechanical failure of vacuum pumps.
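As a hypothetical illustration of the underlying signal processing (the thesis works with real pump data), extracting the dominant vibration frequency from a noisy accelerometer trace with an FFT might look like this in Python, with all parameters invented for the example:

```python
import numpy as np

fs = 1000.0                          # hypothetical sample rate, Hz
t = np.arange(0, 1.0, 1.0 / fs)

# Hypothetical accelerometer trace: a 120 Hz vibration tone (e.g. from a
# degrading bearing) buried in broadband noise.
rng = np.random.default_rng(1)
signal = 0.8 * np.sin(2 * np.pi * 120.0 * t) + 0.3 * rng.standard_normal(t.size)

# Magnitude spectrum of the real-valued signal.
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(t.size, d=1.0 / fs)
peak_hz = freqs[spectrum[1:].argmax() + 1]   # skip the DC bin
```

Tracking how such spectral peaks grow and drift across successive recordings is the kind of trend a three-dimensional (time vs. frequency vs. magnitude) view of the data makes visible.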
ContributorsHalver, Grant (Author) / Taylor, Tom (Thesis director) / Tsakalis, Konstantinos (Committee member) / Fricks, John (Committee member) / School of Mathematical and Statistical Sciences (Contributor) / Barrett, The Honors College (Contributor)
Created2019-05
Description
This Creative Project was carried out in coordination with the capstone project Around the Corner Imaging with Terahertz Waves, which deals with a system designed to implement around-the-corner, or non-line-of-sight (NLoS), imaging. This document discusses the creation of a GUI in MATLAB to control the terahertz imaging system. The GUI was developed in response to a need for synchronization, ease of operation, easy parameter modification, and data management. Along the way, many design decisions were made, ranging from choosing a software platform to determining how variables should be passed; these decisions and considerations are discussed in this document. The resulting GUI has measured up to the design criteria and can be used by anyone wishing to use the terahertz imaging system for further research in the field of around-the-corner or NLoS imaging.
ContributorsWood, Jacob Cannon (Author) / Trichopoulos, Georgios (Thesis director) / Aberle, James (Committee member) / Electrical Engineering Program (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / Barrett, The Honors College (Contributor)
Created2019-05
Description
A factor accounting for the COVID-19 pandemic was added to a generalized linear model to more accurately predict unpaid claims. COVID-19 has affected not just healthcare but all sectors of the economy. Because of this, whether or not an automobile insurance claim was filed during the pandemic needs to be taken into account when estimating unpaid claims. Reserve-estimating functions such as glmReserve from the "ChainLadder" package in the statistical software R were tried first, but they proved insufficient, so building the model manually turned out to be the most effective approach. Using R's glm function, a model was built that emulates linear regression with a factor for COVID-19. The effects of such a model are analyzed based on effectiveness and interpretability. A model such as this would prove useful for future calculations, especially as society returns to a "normal" state.
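A Gaussian GLM with an identity link reduces to linear regression, so the model described here, linear regression with a COVID-19 factor, can be sketched with an indicator (dummy) variable. The following Python sketch uses entirely synthetic data; the variables, coefficients, and effect sizes are hypothetical, not the thesis's results:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic claim-level data: development age in quarters and an
# indicator for whether the claim was filed during the pandemic.
n = 200
age = rng.uniform(1.0, 12.0, n)
covid = (rng.random(n) < 0.3).astype(float)

# Hypothetical ground truth: pandemic-era claims run lower on average.
unpaid = 100.0 - 5.0 * age - 20.0 * covid + rng.normal(0.0, 2.0, n)

# A Gaussian GLM with identity link is ordinary linear regression, so an
# OLS fit emulates the COVID-factor model described above.
X = np.column_stack([np.ones(n), age, covid])
intercept, age_coef, covid_coef = np.linalg.lstsq(X, unpaid, rcond=None)[0]
```

The fitted covid_coef measures the average shift in unpaid claims for pandemic-era claims, holding development age fixed, which is exactly the interpretability the abstract highlights.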
ContributorsKossler, Patrick (Author) / Zicarelli, John (Thesis director) / Milovanovic, Jelena (Committee member) / Barrett, The Honors College (Contributor) / School of Mathematical and Statistical Sciences (Contributor)
Created2022-05
Description
The objective of this study is to build a model using R and RStudio that automates ratemaking procedures for Company XYZ's actuaries in their commercial general liability pricing department. The purpose and importance of this objective is to allow actuaries to work more efficiently and effectively by using a model that outputs the results they would otherwise have had to code and calculate on their own. Instead of spending time producing these results, the actuaries can analyze the findings, strategize accordingly, and communicate with business partners. The model was built from R code that was later transformed with Shiny, an R package that allows interactive web applications to be built from within RStudio. The final result is a Shiny app that first takes in multiple datasets from Company XYZ's data warehouse and displays different views of the data so that actuaries can make selections on development and trend methods. The app outputs the re-created ratemaking exhibits showing the resulting developed and trended loss and premium, as well as the experience-based indicated rate level change based on the prior selections. The ratemaking process and the Shiny app's functionality are detailed in this report.

ContributorsGilkey, Gina (Author) / Zicarelli, John (Thesis director) / Milovanovic, Jelena (Committee member) / Barrett, The Honors College (Contributor) / School of Mathematical and Statistical Sciences (Contributor)
Created2022-05