Matching Items (167)
Description
What classical chaos can do to quantum systems is a fundamental question relevant to a number of branches of physics. The field of quantum chaos has been active for three decades, with the focus on non-relativistic quantum systems described by the Schrödinger equation. By developing an efficient method to solve the Dirac equation in the setting where relativistic particles can tunnel between two symmetric cavities through a potential barrier, chaotic cavities are found to suppress the spread in the tunneling rate. For integrable classical dynamics, the tunneling rate at any given energy assumes a wide range of values that increases with the energy. For chaotic underlying dynamics, however, the spread is greatly reduced. A remarkable feature, which arises only in relativistic quantum systems as a consequence of Klein tunneling, is that substantial tunneling exists even as the particle energy approaches zero. Similar results are found in graphene tunneling devices, implying the high relevance of relativistic quantum chaos to the development of such devices.

Wave propagation through random media occurs in many physical systems, where interesting phenomena such as branched, fractal-like wave patterns can arise. The generic origin of these wave structures is currently a matter of active debate. It is of fundamental interest to develop a minimal, paradigmatic model that can generate robust branched wave structures. In so doing, a general observation in all situations where branched structures emerge is non-Gaussian statistics of wave intensity with an algebraic tail in the probability density function. Thus, a universal algebraic wave-intensity distribution becomes the criterion for the validity of any minimal model of branched wave patterns.

Coexistence of competing species in spatially extended ecosystems is key to biodiversity in nature. Understanding the dynamical mechanisms of coexistence is a fundamental problem of continuous interest not only in evolutionary biology but also in nonlinear science. A continuous model is proposed for cyclically competing species, and the effect of the interplay between the interaction range and mobility on coexistence is investigated. A transition from coexistence to extinction is uncovered with a non-monotonic behavior in the coexistence probability, and switches between spiral and plane-wave patterns arise. The finding that strong mobility can either promote or hamper coexistence, a feature absent in lattice-based models, can be explained in terms of nonlinear partial differential equations.
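The closing paragraph describes a continuous model of cyclically competing species governed by nonlinear partial differential equations. As an illustration only, a minimal sketch of such a system follows: a rock-paper-scissors reaction-diffusion model on a periodic grid, in which the diffusion constant D stands in for mobility. The reaction terms, parameter values, and grid size are assumptions for the sketch, not the specific model of the thesis.

```python
import numpy as np

def laplacian(u):
    """Five-point Laplacian with periodic boundaries (grid spacing = 1)."""
    return (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
            np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4 * u)

def step(a, b, c, D=0.1, sigma=1.0, dt=0.05):
    """One forward-Euler step of a cyclic competition PDE:
    a beats b, b beats c, c beats a (rock-paper-scissors)."""
    rho = a + b + c  # total density caps logistic growth
    da = D * laplacian(a) + a * (1 - rho) - sigma * a * c
    db = D * laplacian(b) + b * (1 - rho) - sigma * b * a
    dc = D * laplacian(c) + c * (1 - rho) - sigma * c * b
    return a + dt * da, b + dt * db, c + dt * dc

rng = np.random.default_rng(0)
n = 128
a, b, c = (rng.random((n, n)) * 0.3 for _ in range(3))
for _ in range(2000):
    a, b, c = step(a, b, c)
print("surviving species:", sum(u.max() > 1e-3 for u in (a, b, c)))
```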
Contributors: Ni, Xuan (Author) / Lai, Ying-Cheng (Thesis advisor) / Huang, Liang (Committee member) / Yu, Hongbin (Committee member) / Akis, Richard (Committee member) / Arizona State University (Publisher)
Created: 2012
Description
Mortality of the 1918 influenza virus was high, partly due to bacterial coinfections. We characterize pandemic mortality in Arizona, which had a high prevalence of tuberculosis. We applied regressions to over 35,000 data points to estimate the basic reproduction number and excess mortality. Age-specific mortality curves show elevated mortality for all age groups, especially the young, and senior-sparing effects. The low value of the reproduction number indicates that transmissibility was moderately low.
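The abstract mentions estimating the basic reproduction number by regression. A minimal sketch of one standard growth-rate estimator, R = 1 + r·Tg for an exponentially distributed generation interval of mean Tg, fit to hypothetical weekly death counts (the numbers below are illustrative, not the Arizona data):

```python
import numpy as np

# Hypothetical weekly death counts during the initial growth phase
# (illustrative numbers, not the archival Arizona data).
weeks = np.arange(6)
deaths = np.array([12, 19, 33, 52, 85, 140])

# Fit log-linear growth: log(deaths) = log(c) + r * t
r_per_week = np.polyfit(weeks, np.log(deaths), 1)[0]
r_per_day = r_per_week / 7.0

# Simple growth-rate estimator: R = 1 + r * Tg for an exponentially
# distributed generation interval of mean Tg days.
Tg = 4.0
R = 1.0 + r_per_day * Tg
print(f"growth rate r = {r_per_day:.3f}/day, R ~ {R:.2f}")
```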
Contributors: Jenner, Melinda Eva (Author) / Chowell-Puente, Gerardo (Thesis director) / Kostelich, Eric (Committee member) / Barrett, The Honors College (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / School of Life Sciences (Contributor)
Created: 2015-05
Description
Background: While research has quantified the mortality burden of the 1957 H2N2 influenza pandemic in the United States, little is known about how the virus spread locally in Arizona, an area where the dry climate was promoted as reducing respiratory illness transmission yet tuberculosis prevalence was high.
Methods: Using archival death certificates from 1954 to 1961, this study quantified the age-specific seasonal patterns, excess-mortality rates, and transmissibility patterns of the 1957 pandemic in Maricopa County, Arizona. By applying cyclical Serfling linear regression models to weekly mortality rates, excess-mortality rates due to respiratory causes and all causes were estimated for each age group during the pandemic period. The reproduction number was quantified from weekly data using a simple growth-rate method and generation intervals of 3 and 4 days. Local newspaper articles from The Arizona Republic were analyzed for 1957-1958.
Results: Excess-mortality rates varied between waves, age groups, and causes of death, but overall remained low. From October 1959 to June 1960, the most severe wave of the pandemic, the absolute excess-mortality rate based on respiratory deaths was 17.85 per 10,000 population in the elderly (≥65 years). All other age groups had extremely low excess mortality, and the typical U-shaped age pattern was absent. However, relative risk was greatest (3.61) among children and young adolescents (5-14 years) from October 1957 to March 1958, based on incidence rates of respiratory deaths. Transmissibility was greatest during the same 1957-1958 period, when the mean reproduction number was 1.08-1.11, assuming 3- or 4-day generation intervals with exponential or fixed distributions.
Conclusions: Maricopa County largely avoided pandemic influenza from 1957 to 1961. Understanding this historical pandemic and the absence of high excess-mortality rates and transmissibility in Maricopa County may help public health officials prepare for and mitigate future outbreaks of influenza.
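A minimal sketch of the cyclical Serfling regression step described in the Methods above, assuming a 52.17-week annual cycle, an ordinary least-squares fit to non-epidemic weeks, and hypothetical function and variable names:

```python
import numpy as np

def serfling_baseline(week_index, rate, epidemic_mask):
    """Fit a cyclical Serfling regression
        rate ~ a + b*t + c*sin(2*pi*t/52.17) + d*cos(2*pi*t/52.17)
    to non-epidemic weeks and return the predicted baseline."""
    t = np.asarray(week_index, dtype=float)
    omega = 2 * np.pi / 52.17           # annual cycle in weeks
    X = np.column_stack([np.ones_like(t), t,
                         np.sin(omega * t), np.cos(omega * t)])
    train = ~np.asarray(epidemic_mask)  # exclude pandemic weeks from the fit
    coef, *_ = np.linalg.lstsq(X[train], np.asarray(rate, float)[train],
                               rcond=None)
    return X @ coef

# Illustrative use: excess mortality = observed - baseline over pandemic weeks
# t = np.arange(len(weekly_rate)); base = serfling_baseline(t, weekly_rate, mask)
# excess_rate = (weekly_rate - base)[mask].sum()
```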
Contributors: Cobos, April J (Author) / Jehn, Megan (Thesis director) / Chowell-Puente, Gerardo (Committee member) / Barrett, The Honors College (Contributor) / School of Human Evolution and Social Change (Contributor) / School of Life Sciences (Contributor)
Created: 2015-05
Description
Mathematical epidemiology, one of the oldest and richest areas in mathematical biology, has significantly enhanced our understanding of how pathogens emerge, evolve, and spread. Classical epidemiological models, the standard for predicting and managing the spread of infectious disease, assume that contacts between susceptible and infectious individuals depend on their relative frequency in the population. The behavioral factors that underpin contact rates are not generally addressed. There is, however, an emerging class of models that addresses the feedbacks between infectious disease dynamics and the behavioral decisions driving host contact. Referred to as “economic epidemiology” or “epidemiological economics,” the approach explores the determinants of decisions about the number and type of contacts made by individuals, using insights and methods from economics. We show how the approach has the potential both to improve predictions of the course of infectious disease and to support the development of novel approaches to infectious disease management.
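As one stylized example of the feedback this literature studies, the sketch below couples an SIR model to a transmission rate that falls as prevalence rises, a crude stand-in for individuals reducing contacts when infection risk is high. The functional form beta(I) = beta0/(1 + alpha*I) and all parameter values are assumptions for illustration, not a model from the article:

```python
import numpy as np

def sir_adaptive(beta0=0.5, gamma=0.2, alpha=50.0, days=200, dt=0.1):
    """SIR model in which the transmission rate falls as prevalence rises,
    a stylized stand-in for utility-maximizing contact reduction:
        beta(I) = beta0 / (1 + alpha * I)."""
    S, I = 0.999, 0.001
    prevalence = []
    for _ in range(int(days / dt)):
        beta = beta0 / (1.0 + alpha * I)   # endogenous contact choice
        dS = -beta * S * I
        dI = beta * S * I - gamma * I
        S, I = S + dt * dS, I + dt * dI
        prevalence.append(I)
    return np.array(prevalence)

print("peak prevalence:", sir_adaptive().max())
```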
Created: 2015-12-01
Description
Background
Seroepidemiological studies before and after the epidemic wave of H1N1-2009 are useful for estimating population attack rates with a potential to validate early estimates of the reproduction number, R, in modeling studies.
Methodology/Principal Findings
Since the final epidemic size, the proportion of individuals in a population who become infected during an epidemic, is not the result of a binomial sampling process, because infection events are not independent of each other, we propose the use of an asymptotic distribution of the final size to compute approximate 95% confidence intervals of the observed final size. This allows observed final sizes to be compared against predictions from the modeling study (R = 1.15, 1.40, and 1.90), and it also yields simple formulae for determining sample sizes for future seroepidemiological studies. We examine a total of eleven published seroepidemiological studies of H1N1-2009 that took place after the peak incidence was observed in a number of countries. Observed seropositive proportions in six studies appear to be smaller than predicted from R = 1.40; four of the six studies sampled serum less than one month after the reported peak incidence. The comparison of the observed final sizes against R = 1.15 and 1.90 reveals that none of the eleven studies deviates significantly from the prediction with R = 1.15, but the final sizes in nine studies indicate overestimation if the value R = 1.90 is used.
Conclusions
Sample sizes of published seroepidemiological studies were too small to assess the validity of model predictions except when R = 1.90 was used. We recommend the use of the proposed approach in determining the sample size of post-epidemic seroepidemiological studies, calculating the 95% confidence interval of observed final size, and conducting relevant hypothesis testing instead of the use of methods that rely on a binomial proportion.
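A minimal sketch of the calculation the abstract describes: solve the final size equation z = 1 - exp(-Rz) for each assumed R, then form an approximate 95% confidence interval for an observed seropositive proportion from an asymptotic normal distribution of the final size. The variance expression below, with cv the coefficient of variation of the infectious period, is one standard asymptotic form for SIR-type epidemics and stands in for the article's formula:

```python
import numpy as np
from scipy.optimize import brentq

def final_size(R):
    """Solve the final size equation z = 1 - exp(-R * z) for z in (0, 1)."""
    return brentq(lambda z: z - (1.0 - np.exp(-R * z)), 1e-9, 1.0 - 1e-9)

def ci_final_size(R, n, cv=1.0, zcrit=1.96):
    """Approximate 95% CI of an observed seropositive proportion under an
    asymptotic normal distribution of the final size (cv = coefficient of
    variation of the infectious period; cv = 1 for exponential)."""
    z = final_size(R)
    var = z * (1 - z) * (1 + cv**2 * R**2 * (1 - z)) / (1 - R * (1 - z))**2
    half = zcrit * np.sqrt(var / n)
    return z, (z - half, z + half)

for R in (1.15, 1.40, 1.90):
    z, (lo, hi) = ci_final_size(R, n=500)
    print(f"R = {R:.2f}: final size {z:.3f}, 95% CI ({lo:.3f}, {hi:.3f})")
```

Even with n = 500, the interval around the R = 1.15 prediction is wide, echoing the conclusion that small samples cannot discriminate between candidate values of R.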
Created: 2011-03-24
Description
Complex human control is a topic of much interest in the fields of robotics, manufacturing, space exploration, and many others. Even simple tasks that humans perform with ease can be extremely complicated when observed from a controls and complex-systems perspective. One such simple task is that of a human carrying and moving a coffee cup. Though this may be a mundane task for humans, when this task is modeled and analyzed, the system may be quite chaotic in nature. Understanding such systems is key to the development of robots and autonomous systems that can perform these tasks themselves.

The coffee cup system can be simplified and modeled by a cart-and-pendulum system. Bazzi et al. and Maurice et al. present two different cart-and-pendulum systems to represent the coffee cup system [1],[2]. The purpose of this project was to build upon these systems, gain a better understanding of the coffee cup system, and determine where chaos exists within it. The honors thesis team first worked with their senior design group to develop a mathematical model for the cart-and-pendulum system based on the Bazzi and Maurice papers [1],[2]. This system was analyzed and then extended by the honors thesis team into a cart-and-two-pendulum model that represents the coffee cup system more accurately.

Analysis of the single-pendulum model showed that there exists a low-frequency region where the pendulum and the cart remain in phase with each other and a high-frequency region where the cart and pendulum have a π phase difference between them. The transition point between the low- and high-frequency regions is determined by the resonant frequency of the pendulum. The analysis of the two-pendulum system confirmed this result and revealed that differences in length between the pendulums cause them to transition to the high-frequency region at separate frequencies. The pendulums have different resonant frequencies and transition into the high-frequency region based on their own resonant frequency. This creates a range of frequencies in which the pendulums are out of phase with each other. After both pendulums have transitioned, they remain in phase with each other and out of phase with the cart.

However, if the length of the pendulum is decreased too much, the system starts to exhibit chaotic behavior. The short pendulum starts to act in a chaotic manner, and the phase relationship between the pendulums and the cart is no longer maintained. Since the pendulum length represents the distance between a particle of coffee and the top of the cup, this implies that coffee near the top of the cup would cause the system to act chaotically. Further analysis would be needed to determine why the length affects the system in this way.
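A minimal sketch of the single-pendulum version of this system: a pendulum of length L on a cart with prescribed sinusoidal motion, integrated below and above the pendulum's resonant frequency. The equation of motion and damping term are standard for a pendulum on an accelerating cart; the length, amplitude, and damping ratio are illustrative assumptions, not the parameters of [1],[2]:

```python
import numpy as np
from scipy.integrate import solve_ivp

g = 9.81

def pendulum_on_cart(t, y, L, A, omega, zeta=0.05):
    """Pendulum of length L on a cart driven as x(t) = A*sin(omega*t):
    theta'' = -(g/L)*sin(theta) - (x''/L)*cos(theta) - damping."""
    theta, dtheta = y
    xddot = -A * omega**2 * np.sin(omega * t)  # prescribed cart acceleration
    return [dtheta,
            -(g / L) * np.sin(theta) - (xddot / L) * np.cos(theta)
            - 2 * zeta * np.sqrt(g / L) * dtheta]

L = 0.04                   # pendulum length ~ coffee depth below the rim (m)
omega_n = np.sqrt(g / L)   # resonant frequency of the pendulum
for omega in (0.5 * omega_n, 2.0 * omega_n):  # below and above resonance
    sol = solve_ivp(pendulum_on_cart, (0, 30), [0.01, 0.0],
                    args=(L, 0.005, omega), max_step=0.01)
    print(f"omega/omega_n = {omega / omega_n:.1f}: "
          f"max |theta| = {np.abs(sol.y[0]).max():.3f} rad")
```

Comparing the pendulum angle against the cart position for the two driving frequencies reproduces the in-phase versus out-of-phase behavior described above.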
Contributors: Zindani, Abdul Rahman (Co-author) / Crane, Kari (Co-author) / Lai, Ying-Cheng (Thesis director) / Jiang, Junjie (Committee member) / Electrical Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2019-12
Description
Open-source image analytics and data mining software are widely available but can be overly complicated and non-intuitive for physicians and medical researchers to use. The ASU-Mayo Clinic Imaging Informatics Lab has developed an in-house pipeline to process medical images, extract imaging features, and develop multi-parametric models to assist disease staging and diagnosis. The tools have been used extensively in a number of medical studies including brain tumor, breast cancer, liver cancer, Alzheimer's disease, and migraine. Recognizing the need from users in the medical field for a simplified interface and streamlined functionalities, this project aims to democratize this pipeline so that it is more readily available to health practitioners and third-party developers.
Contributors: Baer, Lisa Zhou (Author) / Wu, Teresa (Thesis director) / Wang, Yalin (Committee member) / Computer Science and Engineering Program (Contributor) / W. P. Carey School of Business (Contributor) / Barrett, The Honors College (Contributor)
Created: 2016-12
Description
Video object segmentation (VOS) is an important task in computer vision with many applications, e.g., video editing, object tracking, and object-based encoding. Unlike image object segmentation, video object segmentation must consider both spatial and temporal coherence of the object. Despite extensive previous work, the problem is still challenging. Usually, the foreground object in a video draws more attention from humans, i.e., it is salient. In this thesis we tackle the problem from the perspective of saliency, where saliency means a certain subset of visual information selected by a visual system (human or machine). We present a novel unsupervised method for video object segmentation that considers both low-level vision cues and high-level motion cues. In our model, video object segmentation is formulated as a unified energy minimization problem and solved in polynomial time by employing the min-cut algorithm. Specifically, our energy function comprises a unary term and a pairwise interaction energy term, where the unary term measures region saliency and the interaction term smooths the mutual effects between object saliency and motion saliency. Object saliency is computed in the spatial domain from each discrete frame using multi-scale context features, e.g., color histograms, gradients, and graph-based manifold ranking. Meanwhile, motion saliency is calculated in the temporal domain by extracting phase information from the video. In the experimental section of this thesis, the proposed method is evaluated on several benchmark datasets. On the MSRA 1000 dataset, the results demonstrate that our spatial object saliency detection is superior to state-of-the-art methods. Moreover, our temporal motion saliency detector achieves better performance than existing motion detection approaches on the UCF sports action analysis dataset and the Weizmann dataset, respectively. Finally, we show attractive empirical results and quantitative evaluation of our approach on two benchmark video object segmentation datasets.
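A toy version of the min-cut formulation described above: unary costs become terminal links, pairwise smoothness becomes neighbor links, and the minimum s-t cut yields the binary labeling. This sketch uses a uniform smoothness weight and networkx's generic min-cut for clarity; the thesis's actual energy (saliency-dependent pairwise terms, multi-scale features) is not reproduced, and the function names are hypothetical:

```python
import networkx as nx
import numpy as np

def segment_by_graph_cut(unary_fg, unary_bg, smoothness):
    """Binary labeling by s-t min-cut on a 4-connected pixel grid.
    unary_fg/unary_bg: per-pixel costs of labeling a pixel foreground or
    background (e.g., derived from saliency); smoothness: pairwise weight."""
    h, w = unary_fg.shape
    G = nx.DiGraph()
    s, t = "src", "sink"
    for i in range(h):
        for j in range(w):
            p = (i, j)
            G.add_edge(s, p, capacity=float(unary_bg[i, j]))  # paid if p = bg
            G.add_edge(p, t, capacity=float(unary_fg[i, j]))  # paid if p = fg
            for q in ((i + 1, j), (i, j + 1)):                # 4-neighbor links
                if q[0] < h and q[1] < w:
                    G.add_edge(p, q, capacity=smoothness)
                    G.add_edge(q, p, capacity=smoothness)
    _, (src_side, _) = nx.minimum_cut(G, s, t)
    label = np.zeros((h, w), dtype=bool)
    for node in src_side:
        if node != s:
            label[node] = True
    return label  # True = foreground

# Toy usage: a bright 2x2 block in a 4x4 frame stands in for a salient object.
sal = np.zeros((4, 4)); sal[1:3, 1:3] = 0.9
print(segment_by_graph_cut(1 - sal, sal, smoothness=0.2).astype(int))
```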
Contributors: Wang, Yilin (Author) / Li, Baoxin (Thesis advisor) / Wang, Yalin (Committee member) / Cleveau, David (Committee member) / Arizona State University (Publisher)
Created: 2013
Description
Learning from high-dimensional biomedical data has attracted much attention recently. High-dimensional biomedical data often suffer from the curse of dimensionality and have imbalanced class distributions. Both of these features, high dimensionality and imbalanced class distributions, are challenging for traditional machine learning methods and may degrade model performance. In this thesis, I focus on developing learning methods for high-dimensional, imbalanced biomedical data. In the first part, a sparse canonical correlation analysis (CCA) method is presented. Penalty terms are used to control the sparsity of the projection matrices of CCA. The sparse CCA method is then applied to find patterns among biomedical data sets and labels, or to find patterns among different data sources. In the second part, I discuss several learning problems for imbalanced biomedical data. Traditional learning systems are often biased when the biomedical data are imbalanced, so traditional evaluations such as accuracy may be inappropriate for such cases. I therefore discuss several alternative criteria for evaluating learning performance. For imbalanced binary classification problems, I use the undersampling-based classifier ensemble (UEM) strategy to obtain accurate models for both classes of samples. A small sphere and large margin (SSLM) approach is also presented to detect rare abnormal samples among a large number of subjects. In addition, I apply multiple feature selection and clustering methods to deal with high-dimensional data and data with highly correlated features. Experiments on high-dimensional, imbalanced biomedical data are presented which illustrate the effectiveness and efficiency of my methods.
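A minimal sketch of an undersampling-based classifier ensemble in the spirit of the UEM strategy mentioned above: each member trains on all minority samples plus an equal-sized random draw from the majority class, and the members' predicted probabilities are averaged. The logistic-regression base learner, member count, and function names are assumptions for illustration:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def undersample_ensemble(X, y, n_models=11, seed=0):
    """Train an ensemble in which each member sees all minority samples
    plus an equal-sized random subset of the majority class, so no single
    model is swamped by the majority class."""
    rng = np.random.default_rng(seed)
    idx_min = np.flatnonzero(y == 1)   # assume label 1 = rare class
    idx_maj = np.flatnonzero(y == 0)
    models = []
    for _ in range(n_models):
        sub = rng.choice(idx_maj, size=idx_min.size, replace=False)
        idx = np.concatenate([idx_min, sub])
        models.append(LogisticRegression(max_iter=1000).fit(X[idx], y[idx]))
    return models

def predict_proba(models, X):
    """Average the members' minority-class probabilities."""
    return np.mean([m.predict_proba(X)[:, 1] for m in models], axis=0)
```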
Contributors: Yang, Tao (Author) / Ye, Jieping (Thesis advisor) / Wang, Yalin (Committee member) / Davulcu, Hasan (Committee member) / Arizona State University (Publisher)
Created: 2013
Description
This dissertation constructs a new computational processing framework to robustly and precisely quantify retinotopic maps based on their angle-distortion properties. More generally, this framework solves the problem of how to robustly and precisely quantify (angle) distortions of noisy or incomplete (boundary-enclosed) 2-dimensional surface-to-surface mappings. The framework builds upon the Beltrami Coefficient (BC) description of quasiconformal mappings, which directly quantifies local mapping (circles-to-ellipses) distortions between diffeomorphisms of boundary-enclosed plane domains homeomorphic to the unit disk. A new map called the Beltrami Coefficient Map (BCM) was constructed to describe distortions in retinotopic maps. The BCM can be used to fully reconstruct the original target surface (retinal visual field) of retinotopic maps. This dissertation also compared retinotopic maps in the visual processing cascade, a series of connected retinotopic maps responsible for processing the visual data of physical images captured by the eyes. By comparing the BCM results from a large Human Connectome Project (HCP) retinotopic dataset (N=181), a new computational quasiconformal-mapping description of the transformed retinal image as it passes through the cascade is proposed, which is not present in any current literature. The description, applied to the HCP data, provided directly visible and quantifiable geometric properties of the cascade in a way that has not been observed before. Because retinotopic maps are generated from in vivo noisy functional magnetic resonance imaging (fMRI), quantifying them comes with a certain degree of uncertainty. To quantify the uncertainties in the quantification results, it is necessary to generate statistical models of retinotopic maps from their BCMs and raw fMRI signals. Considering that estimating retinotopic maps from real noisy fMRI time series using the population receptive field (pRF) model is a time-consuming process, a convolutional neural network (CNN) was constructed and trained to predict pRF model parameters from real noisy fMRI data.
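The Beltrami coefficient underlying the BCM admits a compact numerical illustration: for a planar map f, mu = f_zbar / f_z, computed here by finite differences on a regular grid. The dissertation works with mappings on meshed cortical surfaces, so this grid version and the function name are simplifying assumptions:

```python
import numpy as np

def beltrami_coefficient(f_grid):
    """Beltrami coefficient mu = f_zbar / f_z of a planar map f sampled on a
    regular grid (complex-valued array f[i, j] = f(x_j + 1j*y_i), unit
    spacing). |mu| < 1 for orientation-preserving quasiconformal maps; mu
    encodes how infinitesimal circles are distorted into ellipses."""
    f = np.asarray(f_grid, dtype=complex)
    f_y, f_x = np.gradient(f)          # rows vary in y, columns in x
    f_z = 0.5 * (f_x - 1j * f_y)
    f_zbar = 0.5 * (f_x + 1j * f_y)
    return f_zbar / f_z

# Sanity check: the affine stretch f(z) = 2x + i*y has mu = (2-1)/(2+1) = 1/3.
y, x = np.mgrid[0:32, 0:32].astype(float)
mu = beltrami_coefficient(2 * x + 1j * y)
print(np.allclose(mu, 1 / 3))
```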
Contributors: Ta, Duyan Nguyen (Author) / Wang, Yalin (Thesis advisor) / Lu, Zhong-Lin (Committee member) / Hansford, Dianne (Committee member) / Liu, Huan (Committee member) / Li, Baoxin (Committee member) / Arizona State University (Publisher)
Created: 2022