Matching Items (114)
149730-Thumbnail Image.png
Description
Nonlinear dispersive equations model nonlinear waves in a wide range of physical and mathematical contexts. They reinforce or dissipate the effects of linear dispersion and nonlinear interactions, and thus may be of a focusing or defocusing nature. The nonlinear Schrödinger equation, or NLS, is an example of such an equation. It appears as a model in hydrodynamics, nonlinear optics, quantum condensates, heat pulses in solids and various other nonlinear instability phenomena. In mathematics, one of the interests is to look at wave interaction: waves propagating with different speeds and/or in different directions produce either small perturbations comparable with linear behavior, or create solitary waves, or even lead to singular solutions. This dissertation studies the global behavior of finite-energy solutions to the $d$-dimensional focusing NLS equation, $i\partial_t u + \Delta u + |u|^{p-1}u = 0$, with initial data $u_0 \in H^1(\mathbb{R}^d)$; the nonlinearity power $p$ and the dimension $d$ are chosen so that the scaling index $s = \frac{d}{2} - \frac{2}{p-1}$ is between 0 and 1, thus the NLS is mass-supercritical $(s>0)$ and energy-subcritical $(s<1)$. For solutions with $ME[u_0]<1$ ($ME[u_0]$ stands for an invariant, conserved quantity defined in terms of the mass and energy of $u_0$), a sharp threshold for scattering and blowup is given. Namely, if the renormalized gradient $g_u$ of a solution $u$ to NLS is initially less than 1, i.e., $g_u(0)<1$, then the solution exists globally in time and scatters in $H^1$ (approaches some linear Schrödinger evolution as $t \to \pm\infty$); if $g_u(0)>1$, then the solution exhibits blowup behavior, that is, either a finite-time blowup occurs, or the $H^1$ norm diverges in infinite time.
This work generalizes the results for the 3d cubic NLS obtained in a series of papers by Holmer-Roudenko and Duyckaerts-Holmer-Roudenko, with the key ingredients, concentration compactness and localized variance, developed in the context of the energy-critical NLS and nonlinear wave equations by Kenig and Merle. One of the difficulties is the fractional power of the nonlinearity, which is overcome by considering Besov-Strichartz estimates and various fractional differentiation rules.
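The criticality window described above can be stated explicitly in terms of the nonlinearity power. Solving $0 < s < 1$ for $p$ (the upper bound applying for $d \ge 3$) gives:

```latex
% Intercriticality window for the scaling index s = d/2 - 2/(p-1):
%   s > 0 (mass-supercritical)   <=>  p > 1 + 4/d
%   s < 1 (energy-subcritical)   <=>  p < 1 + 4/(d-2)   (d >= 3)
\[
  0 < s < 1
  \quad\Longleftrightarrow\quad
  1 + \frac{4}{d} \;<\; p \;<\; 1 + \frac{4}{d-2} \qquad (d \ge 3).
\]
% Example: the 3d cubic NLS (d = 3, p = 3) has s = 3/2 - 2/2 = 1/2.
```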
ContributorsGuevara, Cristi Darley (Author) / Roudenko, Svetlana (Thesis advisor) / Castillo-Chavez, Carlos (Committee member) / Jones, Donald (Committee member) / Mahalov, Alex (Committee member) / Suslov, Sergei (Committee member) / Arizona State University (Publisher)
Created2011
151515-Thumbnail Image.png
Description
This thesis outlines the development of a vector retrieval technique, based on data assimilation, for a coherent Doppler LIDAR (Light Detection and Ranging). A detailed analysis of the Optimal Interpolation (OI) technique for vector retrieval is presented. Through several modifications to the OI technique, it is shown that the modified technique yields a significant improvement in velocity retrieval accuracy. These modifications include changes to innovation covariance partitioning, covariance binning, and the analysis increment calculation. It is observed that the modified technique makes retrievals with better accuracy, preserves local information better, and compares well with tower measurements. In order to study the error of representativeness and vector retrieval error, a lidar simulator was constructed. Using the lidar simulator, a thorough sensitivity analysis of the lidar measurement process and vector retrieval was carried out. The error of representativeness as a function of scales of motion and the sensitivity of vector retrieval to look angle were quantified. Using the modified OI technique, a study of nocturnal flow in Owens Valley, CA was carried out to identify and understand uncharacteristic events on the night of March 27, 2006. Observations from 1030 UTC to 1230 UTC (0230 to 0430 local time) on March 27, 2006 are presented. Lidar observations show complex and uncharacteristic flows such as sudden bursts of westerly cross-valley wind mixing with the dominant up-valley wind. Model results from the Coupled Ocean/Atmosphere Mesoscale Prediction System (COAMPS®) and other in situ instrumentation are used to corroborate and complement these observations. The modified OI technique is also used to identify uncharacteristic and extreme flow events at a wind development site. Estimates of turbulence and shear from this technique are compared to tower measurements.
A formulation for equivalent wind speed in the presence of variations in wind speed and direction, combined with shear, is developed and used to determine wind energy content in the presence of turbulence.
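The Optimal Interpolation analysis step at the heart of the retrieval can be illustrated with a minimal single-gate sketch. A Doppler lidar observes only the radial component of the wind, so the horizontal vector is retrieved by combining radial velocities from several azimuths with a background estimate. The azimuth geometry, error covariances, and noise level below are illustrative assumptions, not the thesis's configuration.

```python
import numpy as np

# Toy setup: radial velocity v_r = u*cos(az) + v*sin(az) observed at
# several azimuth angles (all values assumed for illustration).
rng = np.random.default_rng(2)
u_true, v_true = 5.0, -3.0
az = np.deg2rad([0.0, 30.0, 60.0, 90.0, 120.0])
H = np.column_stack([np.cos(az), np.sin(az)])      # observation operator
y = H @ np.array([u_true, v_true]) + rng.normal(scale=0.3, size=az.size)

x_b = np.zeros(2)            # background (first-guess) wind vector
B = 25.0 * np.eye(2)         # background error covariance (assumed)
R = 0.09 * np.eye(az.size)   # observation error covariance (assumed)

# OI analysis increment: x_a = x_b + K (y - H x_b),
# with gain K = B H^T (H B H^T + R)^-1
K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
x_a = x_b + K @ (y - H @ x_b)
```

With a weakly informative background (large B), the analysis is close to the least-squares fit of the radial observations; tightening B pulls the retrieval toward the first guess.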
ContributorsChoukulkar, Aditya (Author) / Calhoun, Ronald (Thesis advisor) / Mahalov, Alex (Committee member) / Kostelich, Eric (Committee member) / Huang, Huei-Ping (Committee member) / Phelan, Patrick (Committee member) / Arizona State University (Publisher)
Created2013
Description
It is possible in a properly controlled environment, such as industrial metrology, to make significant headway against the non-industrial constraints on image-based position measurement using the techniques of image registration, and to achieve repeatable feature measurements on the order of 0.3% of a pixel, about an order of magnitude better than conventional real-world performance. These measurements are then used as inputs for model-optimal, model-agnostic smoothing in the calibration of a laser scribe and the online tracking of a velocimeter using video input. Using appropriate smooth interpolation to increase effective sample density can reduce uncertainty and improve estimates. Using the proper negative offset of the template function creates a convolution with higher local curvature than either the template or the target function, which allows improved center-finding. Using the Akaike Information Criterion with a smoothing spline function, it is possible to perform a model-optimal smooth on scalar measurements without knowing the underlying model, and to determine the function describing the uncertainty in that optimal smooth. An example of empirical derivation of the parameters for a rudimentary Kalman filter from this is then provided and tested. Using the techniques of Exploratory Data Analysis and the "Formulize" genetic algorithm tool to convert the spline models into more accessible analytic forms resulted in a stable, properly generalized Kalman filter whose performance and simplicity exceed "textbook" implementations. Validation of the measurement shows that, in the analytic case, it leads to arbitrary precision in feature measurement; in a reasonable test case using the proposed methods, a consistent maximum error of around 0.3% of the length of a pixel was achieved, and in practice, using pixels 700 nm in size, feature position was located to within ±2 nm.
Robust applicability is demonstrated by the measurement of indicator position for a King model 2-32-G-042 rotameter.
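The AIC-driven, model-agnostic smoothing idea can be sketched with a toy example: score candidate smooths of increasing flexibility by AIC and keep the minimizer, without assuming the underlying model. The signal, noise level, and use of polynomial fits (rather than the thesis's smoothing splines) are assumptions made purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 2.0 * np.pi, 200)
y = np.sin(x) + rng.normal(scale=0.2, size=x.size)   # unknown "true" model

def aic(y, yhat, k):
    """AIC for Gaussian residuals: n*log(RSS/n) + 2k."""
    n = len(y)
    rss = float(np.sum((y - yhat) ** 2))
    return n * np.log(rss / n) + 2 * k

# Fit smooths of increasing flexibility; the 2k penalty stops the
# selection before the fit starts chasing noise.
scores = {}
for deg in range(1, 10):
    coef = np.polyfit(x, y, deg)
    scores[deg] = aic(y, np.polyval(coef, x), deg + 1)
best_deg = min(scores, key=scores.get)
```

The residuals of the selected smooth then provide an empirical noise estimate of the kind used to seed the rudimentary Kalman filter described above.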
ContributorsMunroe, Michael R (Author) / Phelan, Patrick (Thesis advisor) / Kostelich, Eric (Committee member) / Mahalov, Alex (Committee member) / Arizona State University (Publisher)
Created2012
141434-Thumbnail Image.png
Description

Background: Extreme heat is a public health challenge. The scarcity of directly comparable studies on the association of heat with morbidity and mortality, and the inconsistent identification of threshold temperatures for severe impacts, hamper the development of comprehensive strategies aimed at reducing adverse heat-health events.

Objectives: This quantitative study was designed to link temperature with mortality and morbidity events in Maricopa County, Arizona, USA, with a focus on the summer season.

Methods: Using Poisson regression models that controlled for temporal confounders, we assessed daily temperature–health associations for a suite of mortality and morbidity events, diagnoses, and temperature metrics. Minimum risk temperatures, increasing risk temperatures, and excess risk temperatures were statistically identified to represent different “trigger points” at which heat-health intervention measures might be activated.
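The core of the Methods can be sketched on synthetic data: a Poisson regression of daily deaths on temperature above a trigger point, fit by Newton's method (IRLS). The 35 °C threshold, slope, and sample size below are illustrative assumptions, not the study's estimates.

```python
import numpy as np

# Synthetic daily series: deaths ~ Poisson(exp(b0 + b1 * excess heat))
rng = np.random.default_rng(1)
n = 2000
temp = rng.uniform(20.0, 45.0, n)                  # daily temperature, °C
excess = np.clip(temp - 35.0, 0.0, None)           # exposure above trigger point
deaths = rng.poisson(np.exp(1.0 + 0.08 * excess))  # true log-rate: 1 + 0.08*excess

# Fit log E[deaths] = b0 + b1*excess by Newton's method (IRLS for Poisson GLM)
X = np.column_stack([np.ones(n), excess])
beta = np.array([np.log(deaths.mean()), 0.0])      # stable starting point
for _ in range(25):
    mu = np.exp(X @ beta)
    beta = beta + np.linalg.solve(X.T @ (X * mu[:, None]), X.T @ (deaths - mu))
```

In the study itself the regression additionally controls for temporal confounders; the sketch omits those terms to isolate the temperature-health association.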

Results: We found significant and consistent associations of high environmental temperature with all-cause mortality, cardiovascular mortality, heat-related mortality, and mortality resulting from conditions that are consequences of heat and dehydration. Hospitalizations and emergency department visits due to heat-related conditions and conditions associated with consequences of heat and dehydration were also strongly associated with high temperatures, and there were several times more of those events than there were deaths. For each temperature metric, we observed large contrasts in trigger points (up to 22°C) across multiple health events and diagnoses.

Conclusion: Consideration of multiple health events and diagnoses together with a comprehensive approach to identifying threshold temperatures revealed large differences in trigger points for possible interventions related to heat. Providing an array of heat trigger points applicable for different end-users may improve the public health response to a problem that is projected to worsen in the coming decades.

ContributorsPetitti, Diana B. (Author) / Hondula, David M. (Author) / Yang, Shuo (Author) / Harlan, Sharon L. (Author) / Chowell, Gerardo (Author)
Created2016-02-01
141438-Thumbnail Image.png
Description

Maricopa County, Arizona, anchor to the fastest growing megapolitan area in the United States, is located in a hot desert climate where extreme temperatures are associated with elevated risk of mortality. Continued urbanization in the region will impact atmospheric temperatures and, as a result, potentially affect human health. We aimed to quantify the number of excess deaths attributable to heat in Maricopa County based on three future urbanization and adaptation scenarios and multiple exposure variables.

Two scenarios (low and high growth projections) represent the maximum possible uncertainty range associated with urbanization in central Arizona, and a third represents the adaptation of high-albedo cool roof technology. Using a Poisson regression model, we related temperature to mortality using data spanning 1983–2007. Regional climate model simulations based on 2050-projected urbanization scenarios for Maricopa County generated distributions of temperature change, and from these predicted changes, future excess heat-related mortality was estimated. Depending on the urbanization scenario and exposure variable used, projections of heat-related mortality ranged from a decrease of 46 deaths per year (−95%) to an increase of 339 deaths per year (+359%).

Projections based on minimum temperature showed the greatest increase for all expansion and adaptation scenarios and were substantially higher than those for daily mean temperature. Projections based on maximum temperature were largely associated with declining mortality. Low-growth and adaptation scenarios led to the smallest increase in predicted heat-related mortality based on mean temperature projections. Use of only one exposure variable to project future heat-related deaths may therefore be misrepresentative in terms of direction of change and magnitude of effects. Because urbanization-induced impacts can vary across the diurnal cycle, projections of heat-related health outcomes that do not consider place-based, time-varying urban heat island effects are neglecting essential elements for policy relevant decision-making.
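The projection logic in this kind of study can be sketched as follows: a fitted log-linear (Poisson) slope converts a distribution of projected warming into a distribution of excess deaths. The slope, baseline death count, and warming distribution below are assumptions for illustration only, not the paper's values.

```python
import numpy as np

beta = 0.04            # assumed log-relative-risk per °C of warming
baseline_deaths = 120  # assumed current heat-season deaths per year

# Distribution of projected temperature change from a (hypothetical)
# ensemble of regional climate simulations, in °C.
rng = np.random.default_rng(3)
delta_T = rng.normal(loc=1.5, scale=0.5, size=10_000)

# Excess deaths relative to today for each sampled warming value:
# baseline * (relative risk - 1), with relative risk = exp(beta * dT)
excess = baseline_deaths * (np.exp(beta * delta_T) - 1.0)
lo, mid, hi = np.percentile(excess, [2.5, 50.0, 97.5])
```

Repeating the calculation with slopes fit to minimum, mean, and maximum temperature is what exposes the divergence in sign and magnitude described above.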

ContributorsHondula, David M. (Author) / Georgescu, Matei (Author) / Balling, Jr., Robert C. (Author)
Created2014-04-28
141447-Thumbnail Image.png
Description

Preventing heat-associated morbidity and mortality is a public health priority in Maricopa County, Arizona (United States). The objective of this project was to evaluate Maricopa County cooling centers and gain insight into their capacity to provide relief for the public during extreme heat events. During the summer of 2014, 53 cooling centers were evaluated to assess facility and visitor characteristics. Maricopa County staff collected data by directly observing daily operations and by surveying managers and visitors. The cooling centers in Maricopa County were often housed within community, senior, or religious centers, which offered various services for at least 1500 individuals daily. Many visitors were unemployed and/or homeless. Many learned about a cooling center by word of mouth or by having seen the cooling center’s location. The cooling centers provide a valuable service and reach some of the region’s most vulnerable populations. This project is among the first to systematically evaluate cooling centers from a public health perspective and provides helpful insight to community leaders who are implementing or improving their own network of cooling centers.

ContributorsBerisha, Vjollca (Author) / Hondula, David M. (Author) / Roach, Matthew (Author) / White, Jessica R. (Author) / McKinney, Benita (Author) / Bentz, Darcie (Author) / Mohamed, Ahmed (Author) / Uebelherr, Joshua (Author) / Goodin, Kate (Author)
Created2016-09-23
130393-Thumbnail Image.png
Description
Mathematical epidemiology, one of the oldest and richest areas in mathematical biology, has significantly enhanced our understanding of how pathogens emerge, evolve, and spread. Classical epidemiological models, the standard for predicting and managing the spread of infectious disease, assume that contacts between susceptible and infectious individuals depend on their relative frequency in the population. The behavioral factors that underpin contact rates are not generally addressed. There is, however, an emerging class of models that addresses the feedbacks between infectious disease dynamics and the behavioral decisions driving host contact. Referred to as “economic epidemiology” or “epidemiological economics,” the approach explores the determinants of decisions about the number and type of contacts made by individuals, using insights and methods from economics. We show how the approach has the potential both to improve predictions of the course of infectious disease and to support the development of novel approaches to infectious disease management.
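The feedback between disease dynamics and contact behavior can be illustrated with a minimal sketch: an SIR model in which the contact rate falls as prevalence rises (a common stylization in this literature; the specific functional form and parameter values here are assumptions, not the paper's model).

```python
def sir_attack_rate(adaptive, beta0=0.4, gamma=0.2, alpha=50.0,
                    dt=0.1, steps=3000):
    """SIR via forward Euler; if adaptive, contacts fall with prevalence:
    beta(I) = beta0 / (1 + alpha * I)."""
    S, I, R = 0.999, 0.001, 0.0
    for _ in range(steps):
        beta = beta0 / (1.0 + alpha * I) if adaptive else beta0
        new_inf = beta * S * I * dt   # new infections this step
        new_rec = gamma * I * dt      # new recoveries this step
        S -= new_inf
        I += new_inf - new_rec
        R += new_rec
    return R  # cumulative fraction ever infected (attack rate)

fixed = sir_attack_rate(adaptive=False)  # classical frequency-dependent model
adapt = sir_attack_rate(adaptive=True)   # behavior responds to prevalence
```

The adaptive run produces a smaller, flatter epidemic than the classical one, which is the qualitative point: ignoring behavioral feedback overstates the attack rate.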
Created2015-12-01
130400-Thumbnail Image.png
Description
Preserving a system’s viability in the presence of diversity erosion is critical if the goal is to sustainably support biodiversity. Reduction in population heterogeneity, whether inter- or intraspecies, may increase population fragility, either decreasing its ability to adapt effectively to environmental changes or facilitating the survival and success of ordinarily rare phenotypes. The latter may result in over-representation of individuals who participate in resource utilization patterns that can lead to over-exploitation, exhaustion, and, ultimately, collapse of both the resource and the population that depends on it. Here, we aim to identify regimes that can signal whether a consumer–resource system is capable of supporting viable degrees of heterogeneity. The framework used here is an expansion of a previously introduced consumer–resource type system of a population of individuals classified by their resource consumption. Application of the Reduction Theorem to the system enables us to evaluate the health of the system by tracking both the mean value of the parameter of resource (over)consumption and the population variance as both change over time. The article concludes with a discussion that highlights the applicability of the proposed system to the investigation of systems affected by particularly devastating, overly adapted populations, namely cancerous cells. Potential intervention approaches for system management are discussed in the context of cancer therapies.
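The tragedy-of-the-commons dynamic described above, with the mean consumption parameter drifting upward until the shared resource and then the population collapse, can be sketched with an assumed toy system (not the paper's exact equations): subpopulations indexed by a consumption trait share a non-renewing resource, and the trait mean is tracked over time.

```python
import numpy as np

c = np.linspace(0.5, 2.0, 31)     # resource-consumption trait of each subtype
x = np.ones(c.size)               # subtype densities
z = 50.0                          # shared, non-renewing resource stock
K, d, dt = 10.0, 0.5, 0.01        # half-saturation, death rate, time step

mean_c = [float(np.average(c, weights=x))]
for _ in range(20000):
    uptake = z / (z + K)                       # saturating resource uptake
    x = x * (1.0 + dt * (c * uptake - d))      # growth favors heavy consumers
    z = max(z - dt * 0.1 * uptake * float(np.sum(c * x)), 0.0)
    mean_c.append(float(np.average(c, weights=x)))
```

While the resource lasts, selection pushes the mean trait toward the heaviest consumers; once the resource is exhausted, every subtype declines, so the run ends with both a depleted resource and a collapsed population.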
Created2015-02-01
130341-Thumbnail Image.png
Description
Background
In the weeks following the first imported case of Ebola in the U. S. on September 29, 2014, coverage of the very limited outbreak dominated the news media, in a manner quite disproportionate to the actual threat to national public health; by the end of October, 2014, there were only four laboratory-confirmed cases of Ebola in the entire nation. Public interest in these events was high, as reflected in the millions of Ebola-related Internet searches and tweets performed in the month following the first confirmed case. Use of trending Internet searches and tweets has been proposed in the past for real-time prediction of outbreaks (a field referred to as “digital epidemiology”), but accounting for the biases of public panic has been problematic. In the case of the limited U. S. Ebola outbreak, we know that the Ebola-related searches and tweets originating in the U. S. during the outbreak were due only to public interest or panic, providing an unprecedented means to determine how these dynamics affect such data, and how news media may be driving these trends.
Methodology
We examine daily Ebola-related Internet search and Twitter data in the U. S. during the six week period ending Oct 31, 2014. TV news coverage data were obtained from the daily number of Ebola-related news videos appearing on two major news networks. We fit the parameters of a mathematical contagion model to the data to determine if the news coverage was a significant factor in the temporal patterns in Ebola-related Internet and Twitter data.
Conclusions
We find significant evidence of contagion, with each Ebola-related news video inspiring tens of thousands of Ebola-related tweets and Internet searches. Between 65% and 76% of the variance in all samples is described by the news media contagion model.
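The "news drives searches and tweets" relationship can be sketched on synthetic data: daily tweet counts are generated as a per-video response with day-to-day carry-over, and the per-video effect is then recovered by regression. The per-video magnitude, decay rate, and noise level are assumptions for the sketch, not the study's fitted values.

```python
import numpy as np

rng = np.random.default_rng(4)
days = 42
news = rng.poisson(5, size=days).astype(float)  # daily news videos (synthetic)

per_video = 30_000.0   # assumed tweets inspired per news video
decay = 0.5            # assumed day-to-day carry-over of interest

# Generate daily tweet counts: decaying memory of past coverage plus noise
tweets = np.zeros(days)
carry = 0.0
for t in range(days):
    carry = decay * carry + per_video * news[t]
    tweets[t] = carry + rng.normal(scale=5_000)

# Recover the per-video effect from the implied AR(1) form:
#   tweets[t] - decay * tweets[t-1]  ≈  per_video * news[t]
y = tweets[1:] - decay * tweets[:-1]
est = float(np.sum(y * news[1:]) / np.sum(news[1:] ** 2))
```

The recovered coefficient lands near the true per-video effect, which is the same logic by which the contagion model attributes tens of thousands of tweets and searches to each news video.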
Created2015-06-11
130348-Thumbnail Image.png
Description
Background
Seroepidemiological studies before and after the epidemic wave of H1N1-2009 are useful for estimating population attack rates with a potential to validate early estimates of the reproduction number, R, in modeling studies.
Methodology/Principal Findings
Since the final epidemic size, the proportion of individuals in a population who become infected during an epidemic, is not the result of a binomial sampling process, because infection events are not independent of each other, we propose the use of an asymptotic distribution of the final size to compute approximate 95% confidence intervals of the observed final size. This allows observed final sizes to be compared against predictions based on the modeling study (R = 1.15, 1.40 and 1.90), and also yields simple formulae for determining sample sizes for future seroepidemiological studies. We examine a total of eleven published seroepidemiological studies of H1N1-2009 that took place after the peak incidence was observed in a number of countries. Observed seropositive proportions in six studies appear to be smaller than predicted from R = 1.40; four of these six studies sampled serum less than one month after the reported peak incidence. Comparison of the observed final sizes against R = 1.15 and 1.90 reveals that none of the eleven studies deviates significantly from the prediction with R = 1.15, whereas final sizes in nine studies indicate overestimation if the value R = 1.90 is used.
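The model predictions against which the serosurveys are compared come from the classical final size relation for a simple homogeneous epidemic model, z = 1 − exp(−R z), which can be solved by fixed-point iteration (a standard construction; the paper's predictions may incorporate additional structure):

```python
import math

def final_size(R, tol=1e-10):
    """Solve z = 1 - exp(-R*z) for the final epidemic size fraction z (R > 1)."""
    z = 0.9  # start from a large attack rate and iterate to the fixed point
    for _ in range(1000):
        z_new = 1.0 - math.exp(-R * z)
        if abs(z_new - z) < tol:
            return z_new
        z = z_new
    return z

# Predicted final sizes for the reproduction numbers used in the comparison
sizes = {R: final_size(R) for R in (1.15, 1.40, 1.90)}
```

The spread of these predictions (roughly 0.25 at R = 1.15 versus about 0.77 at R = 1.90) is what makes the choice of R so consequential when judging observed seropositive proportions.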
Conclusions
Sample sizes of published seroepidemiological studies were too small to assess the validity of model predictions except when R = 1.90 was used. We recommend the use of the proposed approach in determining the sample size of post-epidemic seroepidemiological studies, calculating the 95% confidence interval of observed final size, and conducting relevant hypothesis testing instead of the use of methods that rely on a binomial proportion.
Created2011-03-24