Matching Items (21)

Description

As the impacts of climate change worsen in the coming decades, natural hazards are expected to increase in frequency and intensity, leading to increased loss and risk to human livelihood. The spatio-temporal statistical approaches developed and applied in this dissertation highlight the ways in which hazard data can be leveraged to understand loss trends, build forecasts, and study societal impacts of losses. Specifically, this work makes use of the Spatial Hazard Events and Losses Database, which is an unparalleled source of loss data for the United States. The first portion of this dissertation develops accurate loss baselines that are crucial for mitigation planning, infrastructure investment, and risk communication. This is accomplished through a stationarity analysis of county-level losses following a normalization procedure. A wide variety of studies employ loss data without addressing stationarity assumptions or the possibility of spurious regression. This work enables the statistically rigorous application of such loss time series to modeling applications. The second portion of this work develops a novel matrix variate dynamic factor model for spatio-temporal loss data stratified across multiple correlated hazards or perils. The developed model is employed to analyze and forecast losses from convective storms, which constitute some of the highest losses covered by insurers. By adopting a factor-based approach, forecasts are achieved despite the complex and often unobserved underlying drivers of these losses. The developed methodology extends the literature on dynamic factor models to matrix variate time series. Specifically, a covariance structure is imposed that is well suited to spatio-temporal problems while significantly reducing model complexity. The model is fit via the EM algorithm and the Kalman filter. The third and final part of this dissertation investigates the impact of compounding hazard events on state and regional migration in the United States. Any attempt to capture trends in climate-related migration must account for the inherent uncertainties surrounding climate change, natural hazard occurrences, and socioeconomic factors. For this reason, I adopt a Bayesian modeling approach that enables explicit estimation of the inherent uncertainty. This work can provide decision-makers with greater clarity regarding the extent of knowledge on climate trends.
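As an illustration of the stationarity checks described above, the sketch below runs an augmented Dickey-Fuller test on a synthetic normalized loss series; the AR(1) series, the significance level, and the use of statsmodels are assumptions for demonstration, not the dissertation's data or pipeline.

```python
# Illustrative stationarity check for a normalized loss time series.
# The data here are synthetic; the dissertation works with county-level loss records.
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(0)
T = 60
losses = np.empty(T)
losses[0] = 0.0
for t in range(1, T):
    # Hypothetical normalized annual losses: a stationary AR(1) driven by Gaussian noise.
    losses[t] = 0.6 * losses[t - 1] + rng.normal(scale=1.0)

# Augmented Dickey-Fuller test: the null hypothesis is a unit root (non-stationarity).
stat, pvalue, *_ = adfuller(losses, autolag="AIC")
print(f"ADF statistic = {stat:.3f}, p-value = {pvalue:.3f}")
if pvalue < 0.05:
    print("Reject the unit-root null: the series looks stationary, so regressions on it are less likely to be spurious.")
else:
    print("Cannot reject a unit root: difference or detrend before fitting regressions.")
```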
Contributors: Boyle, Esther Sarai (Author) / Jevtic, Petar (Thesis advisor) / Lanchier, Nicolas (Thesis advisor) / Lan, Shiwei (Committee member) / Cheng, Dan (Committee member) / Fricks, John (Committee member) / Gall, Melanie (Committee member) / Cutter, Susan (Committee member) / McNicholas, Paul (Committee member) / Arizona State University (Publisher)
Created: 2023
Description

This dissertation centers on treatment effect estimation in the field of causal inference and aims to expand the toolkit for effect estimation when the treatment variable is binary. Two new stochastic tree-ensemble methods for treatment effect estimation in the continuous outcome setting are presented. The Accelerated Bayesian Causal Forest (XBCF) model handles variance via a group-specific parameter, and the Heteroskedastic version of XBCF (H-XBCF) uses a separate tree ensemble to learn covariate-dependent variance. This work also contributes to the field of survival analysis by proposing a new framework for estimating survival probabilities via density regression. Within this framework, the Heteroskedastic Accelerated Bayesian Additive Regression Trees (H-XBART) model, also developed as part of this work, is utilized in treatment effect estimation for right-censored survival outcomes. All models have been implemented as part of the XBART R package, and their performance is evaluated via extensive simulation studies with appropriate sets of comparators. The contributed methods achieve similar levels of performance while being orders of magnitude (sometimes as much as 100x) faster than state-of-the-art comparator methods, thus offering an exciting opportunity for treatment effect estimation in the large-data setting.
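For context on how such simulation studies are typically set up, the sketch below generates data with a known heterogeneous treatment effect and estimates it with a generic T-learner built on scikit-learn gradient boosting; the XBCF and H-XBART models themselves live in the XBART R package, so everything here (data-generating process, learner, metric) is a stand-in chosen purely for illustration.

```python
# Sketch of a treatment-effect simulation study in the spirit described above.
# A T-learner with gradient boosting stands in for the tree-ensemble causal models.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(1)
n, p = 2000, 5
X = rng.normal(size=(n, p))
tau = 1.0 + 0.5 * X[:, 0]                  # true heterogeneous treatment effect
mu = X[:, 1] + np.sin(X[:, 2])             # prognostic (baseline) surface
z = rng.binomial(1, 0.5, size=n)           # binary treatment assignment
y = mu + tau * z + rng.normal(scale=1.0, size=n)

# T-learner: fit separate outcome models for treated and control units.
m1 = GradientBoostingRegressor().fit(X[z == 1], y[z == 1])
m0 = GradientBoostingRegressor().fit(X[z == 0], y[z == 0])
tau_hat = m1.predict(X) - m0.predict(X)

rmse = np.sqrt(np.mean((tau_hat - tau) ** 2))
print(f"RMSE of estimated treatment effects: {rmse:.3f}")
```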
Contributors: Krantsevich, Nikolay (Author) / Hahn, P Richard (Thesis advisor) / McCulloch, Robert (Committee member) / Zhou, Shuang (Committee member) / Lan, Shiwei (Committee member) / He, Jingyu (Committee member) / Arizona State University (Publisher)
Created: 2023
Description

Tracking disease cases is an essential task in public health; however, tracking the number of cases of a disease may be difficult because not every infection can be recorded by public health authorities. Notably, this may happen with whole-country measles case reports, even in countries with robust registration systems. Eilertson et al. (2019) propose using a state-space model combined with maximum likelihood methods for estimating measles transmission. A Bayesian approach that uses particle Markov chain Monte Carlo (pMCMC) is proposed to estimate the parameters of the non-linear state-space model developed in Eilertson et al. (2019) and similar previous studies. This dissertation illustrates the performance of this approach by calculating posterior estimates of the model parameters and predictions of the unobserved states in simulations and case studies. Iterated filtering (IF2) is also used as a supporting method to verify the Bayesian estimation and to inform the selection of prior distributions. In the second half of the thesis, a birth-death process is proposed to model the unobserved population size of a disease vector. This model studies the effect of the disease vector's population size on a second, affected population. The second population follows a non-homogeneous Poisson process when conditioned on the vector process, with a transition rate given by a scaled version of the vector population. The observation model also captures a potential threshold event in which the host species population size surpasses a certain level, yielding a higher transmission rate. A maximum likelihood procedure is developed for this model, which combines particle filtering with the Minorize-Maximization (MM) algorithm and extends the work of Crawford et al. (2014).
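The core computational ingredient of pMCMC is a particle-filter estimate of the likelihood of a non-linear state-space model. The sketch below implements a bootstrap particle filter for a toy count model with lognormal process noise and Poisson observations; the model structure, reporting rate, and parameter values are hypothetical and are not the measles model of Eilertson et al. (2019).

```python
# Bootstrap particle filter for a toy partially observed count process; this is the
# likelihood-estimation building block inside pMCMC, not the dissertation's model.
import numpy as np

def particle_loglik(y, beta, n_particles=500, rng=None):
    """Estimate the log-likelihood of observed case counts y under a toy model where
    latent infections follow noisy multiplicative growth and reported cases are
    Poisson with an assumed reporting rate of 0.5 (all values illustrative)."""
    rng = rng or np.random.default_rng(0)
    x = np.full(n_particles, 100.0)          # initial latent infections
    loglik = 0.0
    for obs in y:
        # propagate: multiplicative growth with lognormal process noise
        x = x * beta * np.exp(rng.normal(scale=0.1, size=n_particles))
        # weight: Poisson observation log-kernel (obs! constant dropped)
        lam = np.maximum(0.5 * x, 1e-8)
        logw = obs * np.log(lam) - lam
        m = logw.max()
        w = np.exp(logw - m)
        loglik += m + np.log(w.mean())
        # resample particles in proportion to their weights
        idx = rng.choice(n_particles, size=n_particles, p=w / w.sum())
        x = x[idx]
    return loglik

y_obs = np.array([60, 70, 85, 100, 130])     # hypothetical monthly case reports
print(particle_loglik(y_obs, beta=1.2))
```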
Contributors: Martinez Rivera, Wilmer Osvaldo (Author) / Fricks, John (Thesis advisor) / Reiser, Mark (Committee member) / Zhou, Shuang (Committee member) / Cheng, Dan (Committee member) / Lan, Shiwei (Committee member) / Arizona State University (Publisher)
Created: 2022
Description

Recent experimental and mathematical work has shown the interdependence of the rod and cone photoreceptors with the retinal pigment epithelium in maintaining sight. Accelerated intake of glucose into the cones via the thioredoxin-like rod-derived cone viability factor (RdCVF) is needed, as aerobic glycolysis is the primary source of energy production. Reactive oxygen species (ROS) result from rod and cone metabolism, and recent experimental work has shown that the long form of RdCVF (RdCVFL) helps mitigate the negative effects of ROS. In this work, I investigate the role of RdCVFL in maintaining the health of the photoreceptors. The results of this mathematical model show the necessity of RdCVFL and also demonstrate additional stable modes that are present in this system. The sensitivity analysis shows the importance of glucose uptake, nutrient levels, and ROS mitigation in maintaining rod and cone health in light-damaged mouse models. Together, these suggest areas on which to focus treatment in order to prolong the photoreceptors, especially in situations where ROS contributes to their death, such as retinitis pigmentosa (RP). A potential treatment with RdCVFL and its effects have never been studied in mathematical models. In this work, I examine an optimal control with RdCVFL treatment and mathematically illustrate the potential that this treatment might have for treating degenerative retinal diseases such as RP. Further, I examine optimal controls with treatment by both RdCVF and RdCVFL in order to mathematically understand the potential that a dual treatment might have for treating such diseases. The RdCVFL control terms are nonlinear for biological accuracy, but this causes the standard general theorems for the existence of optimal controls to fail to apply. I then linearize these models to obtain a proof of existence of an optimal control. The nonlinear and linearized control results are compared and reveal similarly substantial savings rates for rods and cones.
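To give a flavor of the kind of dynamical model involved, the sketch below integrates a toy two-state rod-cone system in which a term proportional to rod density stands in for RdCVF-mediated support of cones; the equations and parameter values are hypothetical and far simpler than the dissertation's model.

```python
# Toy two-compartment ODE illustrating the style of rod-cone interaction model
# discussed above; states, parameters, and the RdCVF coupling term are hypothetical.
from scipy.integrate import solve_ivp

def photoreceptor_rhs(t, state, a_r=1.0, a_c=0.8, k=0.4, d_ros=0.3):
    rods, cones = state
    rdcvf = k * rods                         # RdCVF supplied in proportion to rod density
    drods = a_r * rods * (1 - rods) - d_ros * rods
    dcones = a_c * cones * (1 - cones) + rdcvf * cones - d_ros * cones
    return [drods, dcones]

sol = solve_ivp(photoreceptor_rhs, t_span=(0, 50), y0=[0.9, 0.5])
print("final (rods, cones):", sol.y[:, -1])
```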
Contributors: Wifvat, Kathryn (Author) / Camacho, Erika (Thesis advisor) / Wirkus, Stephen (Thesis advisor) / Gardner, Carl (Committee member) / Fricks, John (Committee member) / Kawski, Matthias (Committee member) / Arizona State University (Publisher)
Created: 2022
Description

The human immunodeficiency virus (HIV) pandemic, which causes the syndrome of opportunistic infections that characterizes late-stage HIV disease, known as acquired immunodeficiency syndrome (AIDS), remains a major public health challenge in many parts of the world. This dissertation contributes deeper qualitative insights into the transmission dynamics and control of HIV/AIDS in the Men who have Sex with Men (MSM) community. A new, relatively basic mathematical model, which incorporates some of the pertinent aspects of HIV epidemiology and immunology, was designed, fitted using yearly new-case data for the MSM population of the State of Arizona, and used to assess the population-level impact of awareness of HIV infection status and condom-based intervention on the transmission dynamics and control of HIV/AIDS in an MSM community. Conditions for the existence and asymptotic stability of the various equilibria of the model were derived. Numerical simulations showed that the prospects for effective control and/or elimination of HIV/AIDS in the MSM community in the United States are very promising using a condom-based intervention, provided the condom efficacy is high and compliance is moderate enough. The model was extended in Chapter 3 to account for the effects of risk structure, the staged-progression property of HIV disease, and the use of pre-exposure prophylaxis (PrEP) on the spread and control of the disease. The extended model was shown to undergo a PrEP-induced backward bifurcation when the associated control reproduction number is less than one. It was shown that when compliance in PrEP usage is 50% (80%), about 19.1% (34.2%) of the yearly new HIV/AIDS cases recorded at the peak would be prevented, in comparison to the worst-case scenario in which no PrEP-based intervention is implemented in the MSM community. It was also shown that elimination of the HIV pandemic from the MSM community is possible even in the scenario where the effective contact rate is increased by 5-fold from its baseline value, if low-risk individuals take at least 15 years before they change their risky behavior and transition to the high-risk group (regardless of the value of the transition rate from the high-risk to the low-risk susceptible population).
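As a rough illustration of how condom efficacy and compliance enter such models, the toy susceptible-infected sketch below scales the contact rate by (1 - efficacy x compliance) and reports the resulting control reproduction number; the structure and parameter values are illustrative and are not the dissertation's fitted model.

```python
# Toy susceptible-infected model showing how condom efficacy (eps) and compliance (c)
# scale the effective contact rate; parameter values are illustrative, not fitted.
from scipy.integrate import solve_ivp

beta, mu, gamma = 0.5, 0.02, 0.1        # contact, recruitment/exit, progression rates
eps, c = 0.9, 0.6                        # assumed condom efficacy and compliance

beta_eff = beta * (1 - eps * c)          # reduced transmission under the intervention
R_c = beta_eff / (mu + gamma)            # control reproduction number for this toy model
print(f"control reproduction number R_c = {R_c:.2f}")

def si_rhs(t, y):
    S, I = y
    infection = beta_eff * S * I / (S + I)
    return [mu * (S + I) - infection - mu * S,
            infection - (mu + gamma) * I]

sol = solve_ivp(si_rhs, (0, 200), y0=[0.99, 0.01])
print("final infected fraction:", sol.y[1, -1] / sol.y.sum(axis=0)[-1])
```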
Contributors: Tollett, Queen Wiggs (Author) / Gumel, Abba (Thesis advisor) / Crook, Sharon (Committee member) / Fricks, John (Committee member) / Gardner, Carl (Committee member) / Nagy, John (Committee member) / Arizona State University (Publisher)
Created: 2023
Description

This dissertation covers several topics in machine learning and causal inference. First, the question of “feature selection,” a common byproduct of regularized machine learning methods, is investigated theoretically in the context of treatment effect estimation. This involves a detailed review and extension of frameworks for estimating causal effects and an in-depth theoretical study. Next, various computational approaches to estimating causal effects with machine learning methods are compared with these theoretical desiderata in mind. Several improvements to current methods for causal machine learning are identified, and compelling angles for further study are pinpointed. Finally, a common method used for “explaining” predictions of machine learning algorithms, SHAP, is evaluated critically through a statistical lens.
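For readers unfamiliar with SHAP, the sketch below computes exact Shapley attributions by brute force for a tiny hypothetical model, which is the quantity that SHAP methods approximate; the model, the baseline, and the "missing feature takes its baseline value" convention are assumptions chosen purely for illustration.

```python
# Brute-force Shapley values for a tiny model, the quantity that SHAP approximates;
# the model and the "absent feature = baseline value" convention are illustrative.
import itertools
import math
import numpy as np

def model(x):                      # hypothetical fitted model: f(x) = 3*x0 + x1*x2
    return 3 * x[0] + x[1] * x[2]

baseline = np.zeros(3)             # reference point standing in for "feature absent"
x = np.array([1.0, 2.0, 0.5])

def value(coalition):
    """Model output when only features in `coalition` take their observed values."""
    z = baseline.copy()
    z[list(coalition)] = x[list(coalition)]
    return model(z)

n = len(x)
phi = np.zeros(n)
for j in range(n):
    others = [k for k in range(n) if k != j]
    for r in range(n):
        for S in itertools.combinations(others, r):
            # Shapley weight |S|! (n - |S| - 1)! / n! times the marginal contribution of j
            weight = math.factorial(len(S)) * math.factorial(n - len(S) - 1) / math.factorial(n)
            phi[j] += weight * (value(S + (j,)) - value(S))

print("Shapley attributions:", phi, "sum + base =", phi.sum() + model(baseline))
```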
Contributors: Herren, Andrew (Author) / Hahn, P Richard (Thesis advisor) / Kao, Ming-Hung (Committee member) / Lopes, Hedibert (Committee member) / McCulloch, Robert (Committee member) / Zhou, Shuang (Committee member) / Arizona State University (Publisher)
Created: 2023
Description

This dissertation comprises two projects: (i) Multiple testing of local maxima for detection of peaks and change points with non-stationary noise, and (ii) Height distributions of critical points of smooth isotropic Gaussian fields: computations, simulations and asymptotics. The first project introduces a topological multiple testing method for one-dimensional domains to detect signals in the presence of non-stationary Gaussian noise. The approach involves conducting tests at local maxima under two observation conditions: (i) the noise is smooth with unit variance, and (ii) the noise is not smooth, in which case kernel smoothing is applied to increase the signal-to-noise ratio (SNR). The smoothed signals are then standardized, which ensures that the variance of the new sequence's noise becomes one, making it possible to calculate p-values for all local maxima using random field theory. The true signals are assumed to be unimodal with finite support, and the non-stationary Gaussian noise can be repeatedly observed. The algorithm introduced in this work demonstrates asymptotic strong control of the False Discovery Rate (FDR) and power consistency as the number of sequence repetitions and the signal strength increase. Simulations indicate that FDR levels can also be controlled under non-asymptotic conditions with finite repetitions. The application of this algorithm to change point detection also guarantees FDR control and power consistency. The second project investigates the explicit and asymptotic height densities of critical points of smooth isotropic Gaussian random fields on both Euclidean space and spheres. The formulae are based on characterizing the distribution of the Hessian of the Gaussian field using Gaussian orthogonally invariant (GOI) matrices and Gaussian orthogonal ensemble (GOE) matrices, which are special cases of GOI matrices. However, as the dimension increases, calculating explicit formulae becomes computationally challenging. The project includes two simulation methods for these distributions. Additionally, asymptotic distributions are obtained by utilizing the asymptotic distribution of the eigenvalues (excluding the maximum eigenvalue) of the GOE matrix for large dimensions; for the maximum eigenvalue, the Tracy-Widom distribution is utilized. Simulation results demonstrate the close approximation between the asymptotic distribution and the true distribution when the dimension N is sufficiently large.
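A minimal sketch of the peak-testing pipeline is given below: smooth a noisy one-dimensional signal, standardize it, take p-values at local maxima, and apply Benjamini-Hochberg. The dissertation's p-values come from random-field-theory height distributions of local maxima; the plain Gaussian tail used here is a simplification, and the signal and noise are synthetic.

```python
# Pipeline sketch for testing local maxima with BH-FDR; a plain Gaussian tail stands
# in for the random-field-theory peak p-values derived in the dissertation.
import numpy as np
from scipy.ndimage import gaussian_filter1d
from scipy.signal import find_peaks
from scipy.stats import norm

rng = np.random.default_rng(2)
n = 2000
signal = np.zeros(n)
signal[480:520] += 3.0                      # one unimodal bump as the true signal
y = signal + rng.normal(size=n)             # noisy observation

ys = gaussian_filter1d(y, sigma=5)          # kernel smoothing to raise the SNR
ys = ys / ys[1000:].std()                   # crude standardization from a signal-free stretch

peaks, _ = find_peaks(ys)
pvals = norm.sf(ys[peaks])                  # upper-tail p-value at each local maximum

# Benjamini-Hochberg procedure at level q = 0.05
q = 0.05
order = np.argsort(pvals)
m = len(pvals)
passed = pvals[order] <= q * (np.arange(1, m + 1) / m)
k = np.max(np.nonzero(passed)[0]) + 1 if passed.any() else 0
detected = peaks[order[:k]]
print("detected peak locations:", detected)
```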
Contributors: Gu, Shuang (Author) / Cheng, Dan (Thesis advisor) / Lopes, Hedibert (Committee member) / Fricks, John (Committee member) / Lan, Shiwei (Committee member) / Zheng, Yi (Committee member) / Arizona State University (Publisher)
Created: 2023
Description

In the realm of discrete ill-posed problems, image deblurring is a challenging problem aimed at restoring clear and visually appealing images from their blurred counterparts. Over the years, various numerical techniques have been developed to solve this problem, each offering unique approaches to tackle blurring and noise. This thesis studies multilevel methods using Daubechies wavelets and Tikhonov regularization. The Daubechies wavelets are a family of orthogonal wavelets widely used in various fields because of their orthogonality and compact support; they have been applied in signal processing, image compression, and other applications. One key aspect of this investigation is a comprehensive comparative analysis with Krylov methods, well-established iterative methods known for their efficiency and accuracy in solving large-scale inverse problems. The focus is on two well-known Krylov methods, namely the hybrid LSQR and the hybrid generalized minimal residual method (GMRES). By contrasting the multilevel and Krylov methods, the aim is to discern their strengths and limitations, facilitating a deeper understanding of their applicability in diverse image-deblurring scenarios. Other critical comparison factors are the noise level adopted during the deblurring process and the amount of blur. To gauge their robustness and performance under different blurry and noisy conditions, this work explores how each method behaves with noise levels ranging from mild to severe and amounts of blur from small to large. Moreover, this thesis combines multilevel and Krylov methods to test a new method for solving inverse problems. This work aims to provide valuable insights into the strengths and weaknesses of these multilevel Krylov methods by shedding light on their efficacy. Ultimately, the findings could have implications across diverse domains, including medical imaging, remote sensing, and multimedia applications, where high-quality and noise-free images are indispensable for accurate analysis and interpretation.
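As a simple point of reference for Tikhonov-regularized deblurring, the sketch below solves a one-dimensional toy problem with SciPy's damped LSQR, where the damp argument plays the role of the Tikhonov parameter; the blur operator, noise level, and regularization value are illustrative, and the multilevel wavelet machinery of the thesis is not represented.

```python
# Toy 1-D deblurring with Tikhonov regularization via damped LSQR; a stand-in for
# the 2-D multilevel and hybrid Krylov solvers studied in the thesis.
import numpy as np
from scipy.linalg import toeplitz
from scipy.sparse.linalg import lsqr

rng = np.random.default_rng(3)
n = 200
x_true = np.zeros(n)
x_true[60:90] = 1.0                          # piecewise-constant "image"

# Gaussian blur as a Toeplitz matrix with row-normalized kernel
t = np.arange(n)
col = np.exp(-0.5 * (t / 3.0) ** 2)
A = toeplitz(col)
A /= A.sum(axis=1, keepdims=True)

b = A @ x_true + 0.01 * rng.normal(size=n)   # blurred, mildly noisy data

# damp is the Tikhonov parameter: minimizes ||Ax - b||^2 + damp^2 ||x||^2
x_naive = lsqr(A, b, damp=0.0, iter_lim=200)[0]
x_reg = lsqr(A, b, damp=0.05, iter_lim=200)[0]
print("error without regularization:", np.linalg.norm(x_naive - x_true))
print("error with Tikhonov damping: ", np.linalg.norm(x_reg - x_true))
```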
Contributors: Amdouni, Bechir (Author) / Espanol, Malena (Thesis advisor) / Renaut, Rosemary (Committee member) / Platte, Rodrigo (Committee member) / Fricks, John (Committee member) / Moustaoui, Mohamed (Committee member) / Arizona State University (Publisher)
Created: 2024
Description

A pneumonia-like illness (coined COVID-19), caused by SARS-CoV-2, emerged late in 2019, causing a devastating global pandemic on a scale not seen since the 1918/1919 influenza pandemic. This dissertation contributes deeper qualitative insights into the transmission dynamics and control of the disease in the United States. A basic mathematical model, which incorporates the key pertinent epidemiological features of SARS-CoV-2 and is fitted using observed COVID-19 data, was designed and used to assess the population-level impacts of vaccination and face mask usage in mitigating the burden of the pandemic in the United States. Conditions for the existence and asymptotic stability of the various equilibria of the model were derived. The model was shown to undergo a vaccine-induced backward bifurcation when the associated reproduction number is less than one. Conditions for achieving vaccine-derived herd immunity were derived for three of the four FDA-approved vaccines (namely the Pfizer, Moderna, and Johnson & Johnson vaccines), and the vaccination coverage level needed to achieve it decreases with increasing coverage of moderately and highly effective face masks. It was also shown that using face masks as a singular intervention strategy could lead to the elimination of the pandemic if moderately or highly effective masks are prioritized, and that the prospects for pandemic elimination are greatly enhanced if the vaccination program is combined with a face mask strategy that emphasizes the use of moderately to highly effective masks with at least moderate coverage. The model was extended in Chapter 3 to allow for the assessment of the impacts of waning and boosting of vaccine-derived and natural immunity against the BA.1 Omicron variant of SARS-CoV-2. It was shown that vaccine-derived herd immunity can be achieved in the United States via a vaccination-boosting strategy that entails fully vaccinating at least 72% of the susceptible populace. Boosting of vaccine-derived immunity was shown to be more beneficial than boosting of natural immunity. Overall, this study showed that the prospects for elimination of the pandemic in the United States were highly promising using the two intervention measures.
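As a back-of-envelope companion to the herd-immunity discussion, the sketch below computes the vaccination coverage at which the effective reproduction number drops below one for an assumed basic reproduction number and assumed vaccine efficacies; the formula ignores waning, masking, and prior infection, and none of the numbers are the dissertation's fitted values.

```python
# Back-of-envelope herd-immunity coverage: the fraction that must be vaccinated so the
# effective reproduction number drops below one. R0 and efficacies are illustrative.
def herd_immunity_coverage(R0: float, vaccine_efficacy: float) -> float:
    """Coverage p solving (1 - efficacy * p) * R0 = 1, capped at 1 if unattainable."""
    p = (1 - 1 / R0) / vaccine_efficacy
    return min(p, 1.0)

for name, eff in [("Pfizer (assumed efficacy 0.95)", 0.95),
                  ("Moderna (assumed efficacy 0.94)", 0.94),
                  ("Johnson & Johnson (assumed efficacy 0.67)", 0.67)]:
    coverage = herd_immunity_coverage(R0=2.5, vaccine_efficacy=eff)
    print(f"{name}: need about {coverage:.0%} coverage")
```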
Contributors: Safdar, Salman (Author) / Gumel, Abba (Thesis advisor) / Kostelich, Eric (Committee member) / Kang, Yun (Committee member) / Fricks, John (Committee member) / Espanol, Malena (Committee member) / Arizona State University (Publisher)
Created: 2023
Description
Understanding the evolution of opinions is a delicate task, as the dynamics of how one changes their opinion based on their interactions with others are unclear.
Contributors: Weber, Dylan (Author) / Motsch, Sebastien (Thesis advisor) / Lanchier, Nicolas (Committee member) / Platte, Rodrigo (Committee member) / Armbruster, Dieter (Committee member) / Fricks, John (Committee member) / Arizona State University (Publisher)
Created: 2021