Matching Items (71)
Description
Nonlinear dispersive equations model nonlinear waves in a wide range of physical and mathematical contexts. They reinforce or dissipate the effects of linear dispersion and nonlinear interactions, and thus may be of a focusing or defocusing nature. The nonlinear Schrödinger equation (NLS) is an example of such equations. It appears as a model in hydrodynamics, nonlinear optics, quantum condensates, heat pulses in solids, and various other nonlinear instability phenomena. In mathematics, one interest is wave interaction: propagation of waves with different speeds and/or in different directions either produces small perturbations comparable with linear behavior, creates solitary waves, or even leads to singular solutions. This dissertation studies the global behavior of finite energy solutions to the $d$-dimensional focusing NLS equation, $i\partial_t u+\Delta u+|u|^{p-1}u=0$, with initial data $u_0\in H^1$, $x\in\mathbb{R}^d$; the nonlinearity power $p$ and the dimension $d$ are chosen so that the scaling index $s=\frac{d}{2}-\frac{2}{p-1}$ is between 0 and 1; thus, the NLS is mass-supercritical $(s>0)$ and energy-subcritical $(s<1)$. For solutions with $ME[u_0]<1$ ($ME[u_0]$ stands for an invariant and conserved quantity in terms of the mass and energy of $u_0$), a sharp threshold for scattering and blowup is given. Namely, if the renormalized gradient $g_u$ of a solution $u$ to NLS is initially less than 1, i.e., $g_u(0)<1$, then the solution exists globally in time and scatters in $H^1$ (approaches some linear Schrödinger evolution as $t\to\pm\infty$); if the renormalized gradient $g_u(0)>1$, then the solution exhibits blowup behavior, that is, either a finite-time blowup occurs, or the $H^1$ norm diverges in infinite time.
This work generalizes the results for the 3d cubic NLS obtained in a series of papers by Holmer-Roudenko and Duyckaerts-Holmer-Roudenko; the key ingredients, concentration compactness and localized variance, were developed in the context of the energy-critical NLS and nonlinear wave equations by Kenig and Merle. One of the difficulties is the fractional powers of the nonlinearity, which are handled by means of Besov-Strichartz estimates and various fractional differentiation rules.
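The criticality range quoted above is easy to verify for concrete equations. A minimal sketch, using only the scaling-index formula stated in the abstract, with the 3d cubic case treated by Holmer-Roudenko as the example:

```python
def scaling_index(d, p):
    """Scaling index s = d/2 - 2/(p-1) for the focusing NLS
    i u_t + Lap u + |u|^{p-1} u = 0 in d dimensions."""
    return d / 2 - 2 / (p - 1)

# 3d cubic NLS (d = 3, p = 3): s = 1/2, so the equation is
# mass-supercritical (s > 0) and energy-subcritical (s < 1).
s = scaling_index(3, 3)
```

The same check classifies other cases at a glance: for fixed $d$, raising $p$ moves $s$ toward the energy-critical endpoint $s=1$.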
ContributorsGuevara, Cristi Darley (Author) / Roudenko, Svetlana (Thesis advisor) / Castillo-Chavez, Carlos (Committee member) / Jones, Donald (Committee member) / Mahalov, Alex (Committee member) / Suslov, Sergei (Committee member) / Arizona State University (Publisher)
Created2011
Description
This thesis outlines the development of a vector retrieval technique, based on data assimilation, for a coherent Doppler LIDAR (Light Detection and Ranging). A detailed analysis of the Optimal Interpolation (OI) technique for vector retrieval is presented. Through several modifications to the OI technique, it is shown that the modified technique results in significant improvement in velocity retrieval accuracy. These modifications include changes to innovation covariance partitioning, covariance binning, and analysis increment calculation. It is observed that the modified technique makes retrievals with better accuracy, preserves local information better, and compares well with tower measurements. In order to study the error of representativeness and the vector retrieval error, a lidar simulator was constructed. Using the lidar simulator, a thorough sensitivity analysis of the lidar measurement process and vector retrieval is carried out. The error of representativeness as a function of scales of motion, and the sensitivity of vector retrieval to look angle, are quantified. Using the modified OI technique, a study of nocturnal flow in Owens Valley, CA, was carried out to identify and understand uncharacteristic events on the night of March 27, 2006. Observations from 1030 UTC to 1230 UTC (0230 to 0430 local time) on March 27, 2006 are presented. Lidar observations show complex and uncharacteristic flows, such as sudden bursts of westerly cross-valley wind mixing with the dominant up-valley wind. Model results from the Coupled Ocean/Atmosphere Mesoscale Prediction System (COAMPS®) and other in situ instrumentation are used to corroborate and complement these observations. The modified OI technique is also used to identify uncharacteristic and extreme flow events at a wind development site, and estimates of turbulence and shear from this technique are compared to tower measurements.
A formulation for equivalent wind speed in the presence of variations in wind speed and direction, combined with shear, is developed and used to determine wind energy content in the presence of turbulence.
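The OI analysis step that the modifications above act on has the standard form x_a = x_b + K(y - H x_b) with gain K = B H^T (H B H^T + R)^{-1}. A minimal numpy sketch of the generic update (the toy matrices are illustrative, not the thesis's partitioned or binned covariances):

```python
import numpy as np

def oi_analysis(xb, y, H, B, R):
    """Optimal Interpolation analysis update.

    xb: background state; y: observations; H: observation operator;
    B: background error covariance; R: observation error covariance.
    Returns the analysis xa = xb + K (y - H xb), where the gain is
    K = B H^T (H B H^T + R)^{-1}.
    """
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
    return xb + K @ (y - H @ xb)

# Toy example: two state variables, only the first observed directly.
xb = np.array([1.0, 2.0])
H = np.array([[1.0, 0.0]])
B = np.array([[1.0, 0.5], [0.5, 1.0]])
R = np.array([[1.0]])
xa = oi_analysis(xb, np.array([3.0]), H, B, R)
```

Note how the off-diagonal element of B spreads the single innovation into the unobserved component, which is the mechanism the covariance-binning modifications tune.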
ContributorsChoukulkar, Aditya (Author) / Calhoun, Ronald (Thesis advisor) / Mahalov, Alex (Committee member) / Kostelich, Eric (Committee member) / Huang, Huei-Ping (Committee member) / Phelan, Patrick (Committee member) / Arizona State University (Publisher)
Created2013
Description
It is possible in a properly controlled environment, such as industrial metrology, to make significant headway into the non-industrial constraints on image-based position measurement using the techniques of image registration, and to achieve repeatable feature measurements on the order of 0.3% of a pixel, about an order of magnitude better than conventional real-world performance. These measurements are then used as inputs for model-optimal, model-agnostic smoothing for calibration of a laser scribe and for online tracking of a velocimeter using video input. Using appropriate smooth interpolation to increase effective sample density can reduce uncertainty and improve estimates. Use of the proper negative offset of the template function creates a convolution with higher local curvature than either the template or the target function, which allows improved center-finding. Using the Akaike Information Criterion with a smoothing spline, it is possible to perform a model-optimal smooth on scalar measurements without knowing the underlying model, and to determine the function describing the uncertainty in that optimal smooth. An example of empirical derivation of the parameters for a rudimentary Kalman filter from this is then provided and tested. Using the techniques of Exploratory Data Analysis and the "Formulize" genetic algorithm tool to convert the spline models into more accessible analytic forms resulted in a stable, properly generalized KF whose performance and simplicity exceed "textbook" implementations. Validation of the measurement shows that, in the analytic case, it led to arbitrary precision in feature measurement; in a reasonable test case using the proposed methods, a consistent maximum error of about 0.3% of the length of a pixel was achieved; and in practice, using pixels 700 nm in size, feature position was located to within ±2 nm.
Robust applicability is demonstrated by the measurement of indicator position for a King model 2-32-G-042 rotameter.
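The rudimentary Kalman filter mentioned above can be illustrated in scalar form. A minimal sketch of a constant-position filter; the noise variances q and r are free parameters here, whereas the thesis derives them empirically from the model-optimal spline smooth:

```python
def kalman_1d(zs, q, r, x0=0.0, p0=1.0):
    """Rudimentary scalar Kalman filter for a constant-position model.

    zs: sequence of measurements; q: process noise variance;
    r: measurement noise variance. Returns the filtered estimates.
    """
    x, p, out = x0, p0, []
    for z in zs:
        p = p + q                # predict: uncertainty grows by q
        k = p / (p + r)          # Kalman gain
        x = x + k * (z - x)      # update toward the measurement
        p = (1 - k) * p          # posterior uncertainty shrinks
        out.append(x)
    return out

# Constant measurements pull the estimate from x0 = 0 toward 1.
est = kalman_1d([1.0] * 50, q=0.01, r=1.0)
```

With q and r chosen from the spline-derived uncertainty function, the same loop serves as a cheap online tracker.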
ContributorsMunroe, Michael R (Author) / Phelan, Patrick (Thesis advisor) / Kostelich, Eric (Committee member) / Mahalov, Alex (Committee member) / Arizona State University (Publisher)
Created2012
Description
This thesis shows analyses of mixing and transport patterns associated with Hurricane Katrina as it hit the United States in August of 2005. Specifically, by applying atmospheric velocity information from the Weather Research and Forecasting system, finite-time Lyapunov exponents have been computed and the Lagrangian Coherent Structures have been identified. The chaotic dynamics of material transport induced by the hurricane result from these structures within the flow. Boundaries of the coherent structures are highlighted by the FTLE field, and individual particle transport within the hurricane is affected by the location of these boundaries. In addition to idealized fluid particles, we also studied inertial particles, which have finite size and inertia. Based on the established Maxey-Riley equations for the dynamics of finite-size particles, we obtain a reduced equation governing the particle position. Using methods derived from computer graphics, we identify maximizers of the FTLE field. Following and applying these ideas, we analyze the dynamics of inertial particle transport within Hurricane Katrina, through comparison of trajectories of different-sized particles and by pinpointing the location of the Lagrangian Coherent Structures.
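The FTLE computation described above amounts to differentiating the flow map and taking the logarithm of its largest singular value. A minimal sketch on a 2-D grid, with a hypothetical linear saddle flow standing in for the WRF-derived velocity field:

```python
import numpy as np

def ftle_field(flow_map, xs, ys, T):
    """FTLE from a flow map sampled on a rectangular grid.

    flow_map(x, y) -> (X, Y): position after integration time T.
    Finite differences approximate the flow-map gradient; the FTLE
    is log(largest singular value of that gradient) / |T|.
    """
    X = np.empty((len(ys), len(xs)))
    Y = np.empty_like(X)
    for i, y in enumerate(ys):
        for j, x in enumerate(xs):
            X[i, j], Y[i, j] = flow_map(x, y)
    dXdx = np.gradient(X, xs, axis=1); dXdy = np.gradient(X, ys, axis=0)
    dYdx = np.gradient(Y, xs, axis=1); dYdy = np.gradient(Y, ys, axis=0)
    out = np.empty_like(X)
    for i in range(X.shape[0]):
        for j in range(X.shape[1]):
            F = np.array([[dXdx[i, j], dXdy[i, j]],
                          [dYdx[i, j], dYdy[i, j]]])  # flow-map Jacobian
            s_max = np.linalg.svd(F, compute_uv=False)[0]
            out[i, j] = np.log(s_max) / abs(T)
    return out

# Saddle flow u = (x, -y): flow map (x e^T, y e^-T), exact FTLE = 1.
T = 1.0
xs = ys = np.linspace(-1, 1, 21)
field = ftle_field(lambda x, y: (x * np.exp(T), y * np.exp(-T)), xs, ys, T)
```

The saddle flow makes a convenient correctness check, since its FTLE is exactly 1 everywhere; ridges of the field computed from real wind data mark the LCS boundaries discussed in the text.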
ContributorsWake, Christian (Author) / Tang, Wenbo (Thesis director) / Moustaoui, Mohamed (Committee member) / Kostelich, Eric (Committee member) / Barrett, The Honors College (Contributor) / College of Liberal Arts and Sciences (Contributor)
Created2012-12
Description
The goal of this project was to examine the separatrices that define regions of distinct flow behaviors in realistic time-dependent dynamical systems. In particular, we adapted previously available methods for computing the Finite-Time Lyapunov Exponent (FTLE) to a set of measured wind velocity data in order to visualize the separatrices as ridges of the FTLE field in a section of the atmosphere. This visualization required a number of alterations to the original methods, including interpolation techniques and two different adaptive refinement schemes for producing more detailed results. Overall, two computations were performed with the wind velocity data: one along a single spherical surface, on which the separatrices could be visualized as material lines, and one along a three-dimensional section of the atmosphere, for which the separatrices were material surfaces. The resulting figures provide an image of the Antarctic polar vortex from the wind velocity data, which is consistent with other data gathered on the same date.
ContributorsUpton, James Thomas (Author) / Tang, Wenbo (Thesis director) / Moustaoui, Mohamed (Committee member) / Barrett, The Honors College (Contributor) / School of International Letters and Cultures (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / Department of Physics (Contributor)
Created2014-05
Description
Using weather data from the Weather Research and Forecasting model (WRF), we analyze the transport of inertial particles in Hurricane Katrina in order to identify coherent patterns of motion. For our analysis, we choose a Lagrangian approach over an Eulerian approach because the Lagrangian approach is objective and frame-independent and gives better-defined results. In particular, we locate Lagrangian Coherent Structures (LCS): smooth sets of fluid particles that are locally most hyperbolic (either attracting or repelling). We implement a variational method for locating LCS and compare the results to previous computations of LCS using Finite-Time Lyapunov Exponents (FTLE) to identify regions of high stretching in the fluid flow.
ContributorsDeibel, Angelica Rae (Author) / Tang, Wenbo (Thesis director) / Moustaoui, Mohamed (Committee member) / Kostelich, Eric (Committee member) / Barrett, The Honors College (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / Department of Physics (Contributor)
Created2013-05
Description
Mathematical epidemiology, one of the oldest and richest areas in mathematical biology, has significantly enhanced our understanding of how pathogens emerge, evolve, and spread. Classical epidemiological models, the standard for predicting and managing the spread of infectious disease, assume that contacts between susceptible and infectious individuals depend on their relative frequency in the population. The behavioral factors that underpin contact rates are not generally addressed. There is, however, an emerging class of models that addresses the feedbacks between infectious disease dynamics and the behavioral decisions driving host contact. Referred to as "economic epidemiology" or "epidemiological economics," the approach explores the determinants of decisions about the number and type of contacts made by individuals, using insights and methods from economics. We show how the approach has the potential both to improve predictions of the course of infectious disease and to support the development of novel approaches to infectious disease management.
Created2015-12-01
Description
Preserving a system’s viability in the presence of diversity erosion is critical if the goal is to sustainably support biodiversity. Reduction in population heterogeneity, whether inter- or intraspecies, may increase population fragility, either decreasing its ability to adapt effectively to environmental changes or facilitating the survival and success of ordinarily rare phenotypes. The latter may result in over-representation of individuals who may participate in resource utilization patterns that can lead to over-exploitation, exhaustion, and, ultimately, collapse of both the resource and the population that depends on it. Here, we aim to identify regimes that can signal whether a consumer–resource system is capable of supporting viable degrees of heterogeneity. The framework used here is an expansion of a previously introduced consumer–resource type system of a population of individuals classified by their resource consumption. Application of the Reduction Theorem to the system enables us to evaluate the health of the system through tracking both the mean value of the parameter of resource (over)consumption and the population variance, as both change over time. The article concludes with a discussion that highlights the applicability of the proposed system to the investigation of systems affected by particularly devastating, overly adapted populations, namely cancerous cells. Potential intervention approaches for system management are discussed in the context of cancer therapies.
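The mean-and-variance bookkeeping that the Reduction Theorem enables can be mimicked by direct simulation. A purely illustrative sketch (the dynamics below are a generic consumer–resource toy, not the paper's model) that tracks the abundance-weighted mean and variance of the consumption parameter:

```python
import numpy as np

def heterogeneous_consumers(cs, n0, steps=200, dt=0.01, supply=1.0):
    """Toy consumer-resource system with heterogeneous consumption rates.

    cs: consumption-rate classes; n0: initial abundances per class.
    Per-capita growth of class i is c_i * resource minus a fixed
    mortality, so faster consumers are favored while the resource lasts.
    Returns the abundance-weighted mean and variance of c over time,
    the two statistics tracked via the Reduction Theorem in the text.
    """
    cs, n = np.asarray(cs, float), np.asarray(n0, float).copy()
    r = supply
    means, variances = [], []
    for _ in range(steps):
        growth = cs * r - 0.1                         # per-capita growth
        n += dt * n * growth
        r += dt * (supply - r - np.sum(cs * n) * r)   # resource depletion
        r = max(r, 0.0)
        w = n / n.sum()
        m = np.sum(w * cs)
        means.append(m)
        variances.append(np.sum(w * (cs - m) ** 2))
    return means, variances

means, variances = heterogeneous_consumers([0.5, 1.0, 1.5], [1.0, 1.0, 1.0])
# Selection pushes the mean consumption rate upward over time.
```

Watching the mean drift toward over-consumption while the variance collapses is exactly the kind of regime signal the abstract describes.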
Created2015-02-01
Description
Background
In the weeks following the first imported case of Ebola in the U.S. on September 29, 2014, coverage of the very limited outbreak dominated the news media, in a manner quite disproportionate to the actual threat to national public health; by the end of October 2014, there were only four laboratory-confirmed cases of Ebola in the entire nation. Public interest in these events was high, as reflected in the millions of Ebola-related Internet searches and tweets performed in the month following the first confirmed case. Use of trending Internet searches and tweets has been proposed in the past for real-time prediction of outbreaks (a field referred to as “digital epidemiology”), but accounting for the biases of public panic has been problematic. In the case of the limited U.S. Ebola outbreak, we know that the Ebola-related searches and tweets originating in the U.S. during the outbreak were due only to public interest or panic, providing an unprecedented means to determine how these dynamics affect such data, and how news media may be driving these trends.
Methodology
We examine daily Ebola-related Internet search and Twitter data in the U.S. during the six-week period ending October 31, 2014. TV news coverage data were obtained from the daily number of Ebola-related news videos appearing on two major news networks. We fit the parameters of a mathematical contagion model to the data to determine whether news coverage was a significant factor in the temporal patterns of the Ebola-related Internet and Twitter data.
Conclusions
We find significant evidence of contagion, with each Ebola-related news video inspiring tens of thousands of Ebola-related tweets and Internet searches. Between 65% and 76% of the variance in all samples is described by the news media contagion model.
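The finding that each news video inspires a roughly proportional burst of searches and tweets can be illustrated with a toy fitting exercise. A minimal sketch (the geometric-decay response kernel, the grid search, and all numbers are illustrative assumptions, not the paper's contagion model):

```python
import numpy as np

def media_response(videos, beta, decay):
    """Predicted daily tweet counts: each news video inspires `beta`
    tweets, with interest decaying geometrically by `decay` per day."""
    pred = np.zeros(len(videos))
    for t, v in enumerate(videos):
        for s in range(t, len(videos)):
            pred[s] += beta * v * decay ** (s - t)
    return pred

def fit(videos, tweets, decays=np.linspace(0.1, 0.9, 9)):
    """Grid-search `decay`; for each candidate, the best `beta` is a
    one-dimensional least-squares projection."""
    best = None
    for d in decays:
        kernel = media_response(videos, 1.0, d)
        beta = kernel @ tweets / (kernel @ kernel)
        err = np.sum((tweets - beta * kernel) ** 2)
        if best is None or err < best[0]:
            best = (err, beta, d)
    return best[1], best[2]

# Synthetic check: data generated with beta=30000, decay=0.5 is recovered.
videos = np.array([0, 2, 5, 3, 1, 0, 0], float)
tweets = media_response(videos, 30000, 0.5)
beta, decay = fit(videos, tweets)
```

On real data, the fraction of variance explained by the fitted prediction plays the role of the 65%–76% figure quoted above.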
Created2015-06-11
Description
Background
Seroepidemiological studies before and after the epidemic wave of H1N1-2009 are useful for estimating population attack rates with a potential to validate early estimates of the reproduction number, R, in modeling studies.
Methodology/Principal Findings
Since the final epidemic size, the proportion of individuals in a population who become infected during an epidemic, is not the result of a binomial sampling process, because infection events are not independent of each other, we propose the use of an asymptotic distribution of the final size to compute approximate 95% confidence intervals for the observed final size. This allows comparison of the observed final sizes against predictions based on the modeling study (R = 1.15, 1.40, and 1.90), and also yields simple formulae for determining sample sizes for future seroepidemiological studies. We examine a total of eleven published seroepidemiological studies of H1N1-2009 that took place after the peak incidence was observed in a number of countries. Observed seropositive proportions in six studies appear to be smaller than predicted from R = 1.40; four of the six studies sampled serum less than one month after the reported peak incidence. Comparison of the observed final sizes against R = 1.15 and 1.90 reveals that none of the eleven studies deviates significantly from the prediction with R = 1.15, while final sizes in nine studies indicate overestimation if the value R = 1.90 is used.
Conclusions
Sample sizes of published seroepidemiological studies were too small to assess the validity of model predictions except when R = 1.90 was used. We recommend the use of the proposed approach in determining the sample size of post-epidemic seroepidemiological studies, calculating the 95% confidence interval of the observed final size, and conducting the relevant hypothesis testing, rather than relying on methods that assume a binomial proportion.
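The predictions being tested come from the standard final-size relation z = 1 - exp(-R z) of classical SIR theory. A minimal sketch solving it for the three R values in the study (the paper's asymptotic confidence intervals are not reproduced here):

```python
import math

def final_size(R, tol=1e-12):
    """Solve the final-size relation z = 1 - exp(-R z) for the attack
    rate z (fraction eventually infected) by fixed-point iteration, R > 1."""
    z = 0.5  # start away from the trivial root z = 0
    for _ in range(1000):
        z_new = 1.0 - math.exp(-R * z)
        if abs(z_new - z) < tol:
            break
        z = z_new
    return z_new

# Attack rates implied by the three reproduction numbers compared above.
attack_rates = {R: final_size(R) for R in (1.15, 1.40, 1.90)}
```

For R only slightly above 1 the iteration converges slowly, since the slope of the map at the root approaches 1; the wide spread of attack rates between R = 1.15 and R = 1.90 is what makes the serological comparison informative.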
Created2011-03-24