Matching Items (122)
Description
I present results of field and laboratory experiments investigating the habitability of one of Earth’s driest environments: the Atacama Desert. This desert, along the west coast of South America spanning Perú and Chile, is one of the driest places on Earth and has been exceedingly arid for millions of years. These conditions create the perfect natural laboratory for assessing life at the extremes of habitability. All known life needs water; however, the extraordinarily dry Atacama Desert is inhabited by well-adapted microorganisms capable of colonizing this hostile environment. I show field and laboratory evidence of an environmental process, water vapor adsorption, that provides a daily, sustainable input of water into the near (3-5 cm) subsurface through water vapor-soil particle interactions. I estimate that this water input may rival the yearly average input of rain in these soils (~2 mm). I also demonstrate, for the first time, that water vapor adsorption depends on mineral composition via a series of laboratory water vapor adsorption experiments. The results of these experiments provide evidence that mineral composition, and ultimately soil composition, measurably and significantly affect the equilibrium soil water content. This suggests that soil microbial communities may be extremely heterogeneous in distribution depending on the distribution of adsorbent minerals. Finally, I present changes in biologically relevant gases (i.e., H2, CH4, CO, and CO2) over long-duration incubation experiments designed to assess the potential for biological activity in soils collected from a hyperarid region in the Atacama Desert. These long-duration experiments mimicked typical water availability conditions in the Atacama Desert; in other words, the incubations were performed without the addition of condensed water. The results suggest a potential for methane production in the live experiments relative to the sterile controls, and thus for biological activity in hyperarid soils. However, due to the extremely low biomass and extremely low rates of activity in these soils, the methods employed here were unable to provide robust evidence for activity. Overall, the hyperarid regions of the Atacama Desert are an important resource for researchers, providing a window into the environmental dynamics and subsequent microbial responses near the limit of habitability.
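The comparison of adsorbed water with rainfall above is made on a depth-equivalent basis. The sketch below shows how a gain in gravimetric soil water content over the 3-5 cm layer converts to millimeters of water; every number is an illustrative placeholder, not a measurement from the dissertation.

```python
# Illustrative conversion from a gain in gravimetric soil water content to an
# equivalent depth of liquid water (mm). Placeholder values only.

bulk_density = 1.5        # g soil per cm^3, assumed typical desert soil
layer_thickness_cm = 4.0  # near-subsurface layer (3-5 cm) considered
delta_gwc = 0.0005        # assumed gain in g water per g soil per adsorption cycle

# g of water gained per cm^2 of surface; 1 g/cm^2 of water equals 10 mm depth
water_mass_per_cm2 = delta_gwc * bulk_density * layer_thickness_cm
equivalent_depth_mm = water_mass_per_cm2 * 10.0

print(f"Equivalent water depth per adsorption cycle: {equivalent_depth_mm:.3f} mm")
# Multiplying by the number of adsorption cycles per year gives the annual
# input to compare against the ~2 mm/yr of rain.
```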
Contributors: Glaser, Donald M (Author) / Hartnett, Hilairy E (Thesis advisor) / Anbar, Ariel (Committee member) / Shock, Everett (Committee member) / Arizona State University (Publisher)
Created: 2022
Description
Neural tissue is a delicate system composed of neurons and their synapses, glial cells for support, and vasculature for oxygen and nutrient delivery. This complexity ultimately gives rise to the human brain, a system researchers have become increasingly interested in replicating for artificial intelligence purposes. Some have even gone so far as to use neuronal cultures as computing hardware, but utilizing an environment closer to a living brain means having to grapple with the same issues faced by clinicians and researchers trying to treat brain disorders. Most outstanding among these are the problems that arise with invasive interfaces. Optical techniques that use fluorescent dyes and proteins have emerged as a solution for noninvasive imaging with single-cell resolution in vitro and in vivo, but feeding in information in the form of neuromodulation still requires implanted electrodes. The implantation process of these electrodes damages nearby neurons and their connections, causes hemorrhaging, and leads to scarring and gliosis that diminish efficacy. Here, a new approach for noninvasive neuromodulation with high spatial precision is described. It combines ultrasound, high-frequency acoustic energy that can be focused to submillimeter regions at significant depths, with electric fields, an effective tool for neuromodulation that lacks spatial precision when used noninvasively. The hypothesis is that, when combined in a specific manner, these will lead to nonlinear effects at neuronal membranes that cause only cells in the region of overlap to be stimulated. Computational modeling confirmed this combination to be uniquely stimulating, contingent on certain physical effects of ultrasound on cell membranes. Subsequent in vitro experiments led to inconclusive results, however, leaving the door open for future experimentation with modified configurations and approaches. The specific combination explored here is also not the only untested technique that may achieve a similar goal.
Contributors: Nester, Elliot (Author) / Wang, Yalin (Thesis advisor) / Muthuswamy, Jitendran (Committee member) / Towe, Bruce (Committee member) / Arizona State University (Publisher)
Created: 2022
Description
Beta-amyloid (Aβ) plaques and tau protein tangles in the brain are now widely recognized as the defining hallmarks of Alzheimer’s disease (AD), followed by structural atrophy detectable on brain magnetic resonance imaging (MRI) scans. However, current methods to detect Aβ/tau pathology are either invasive (lumbar puncture) or quite costly and not widely available (positron emission tomography (PET)). The hippocampus is one of the regions particularly affected by neurodegeneration, and the influence of Aβ/tau on the hippocampus has been a central research focus in the study of AD pathophysiological progression. In this dissertation, I propose three novel machine learning and statistical models to examine subtle aspects of hippocampal morphometry from MRI that are associated with Aβ/tau burden in the brain, measured using PET images. The first is a novel unsupervised feature reduction model that generates a low-dimensional representation of hippocampal morphometry for each individual subject and has superior performance in predicting Aβ/tau burden in the brain. The second is an efficient federated group lasso model that identifies the hippocampal subregions where atrophy is strongly associated with abnormal Aβ/tau. The last is a federated model for imaging genetics, which can identify genetic and transcriptomic influences on hippocampal morphometry. Finally, I present the results of these three models, which have been published in or submitted to peer-reviewed conferences and journals.
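For orientation, the group lasso referenced above penalizes whole groups of coefficients together so that entire subregions are selected or zeroed out jointly. A generic, non-federated form of the objective is sketched below; the dissertation's federated variant, which distributes this computation across data sites, is not reproduced here.

```latex
% Generic group-lasso objective (non-federated sketch): y is the Abeta/tau
% measure, X the hippocampal morphometry features partitioned into G groups
% (e.g., subregions), beta_g the coefficients of group g, and w_g group weights.
\min_{\beta}\; \frac{1}{2}\,\lVert y - X\beta \rVert_2^2
  \;+\; \lambda \sum_{g=1}^{G} w_g\, \lVert \beta_g \rVert_2
```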
Contributors: Wu, Jianfeng (Author) / Wang, Yalin (Thesis advisor) / Li, Baoxin (Committee member) / Liang, Jianming (Committee member) / Wang, Junwen (Committee member) / Wu, Teresa (Committee member) / Arizona State University (Publisher)
Created: 2022
Description
Structural Magnetic Resonance Imaging analysis is a vital component in the study of Alzheimer’s Disease pathology, and several techniques have been developed in existing research. In particular, volumetric approaches in this field are known to be beneficial because of their increased capability to express morphological characteristics compared to manifold methods. To advance the field, this paper proposes an intrinsic volumetric conic system that can be applied to bounded volumetric meshes to enable a more effective study of subjects. The computation of the metric involves the use of heat kernel theory and conformal parameterization on genus-0 surfaces extended to a volumetric domain. Additionally, this paper explores the use of the 'TetCNN' architecture for the classification of hippocampal tetrahedral meshes to detect features that correspond to Alzheimer’s indicators. The model tested achieved remarkable results, with a measured classification accuracy above 90% in the task of differentiating between subjects diagnosed with Alzheimer’s and normal control subjects.
Contributors: George, John Varghese (Author) / Wang, Yalin (Thesis advisor) / Hansford, Dianne (Committee member) / Gupta, Vikash (Committee member) / Arizona State University (Publisher)
Created: 2023
Description
Dissolved inorganic carbon (DIC) and dissolved organic carbon (DOC) are crucial nutrients for autotrophic and heterotrophic microbial life, respectively, in hydrothermal systems. Biogeochemical processes that control amounts of DIC and DOC in Yellowstone hot springs can be investigated by measuring carbon abundances and the corresponding isotopic values. A decade and a half of field work in 10 regions within Yellowstone National Park and subsequent geochemical lab analyses reveal that sulfate-dominant acidic regions have high DOC (up to 57 ppm C) and lower DIC (up to 50 ppm C) compared to neutral-chloride regions with low DOC (< 2 ppm C) and higher DIC (up to 100 ppm C). Abundances and isotopic data suggest that sedimentary rock erosion by acidic hydrothermal fluids, fresh snow-derived meteoric water, and exogenous carbon input allowed by local topography may affect DOC levels. Evaluating the isotopic compositions of DIC and DOC in hydrothermal fluids gives insight into the geology and microbial life in the subsurface across different regions. DIC δ13C values range from -4‰ to +5‰ at pH 5-9 and from -10‰ to +3‰ at pH 2-5, with several springs lower than -10‰. DOC δ13C values parkwide range from -10‰ to -30‰. Within this range, neutral-chloride regions in the Lower Geyser Basin are isotopically lighter than sulfate-dominant acidic regions. In hot springs with elevated levels of DOC, the range varies only between -20‰ and -26‰, which may be caused by local runoff of exogenous organic matter. Combining other geochemical measurements, such as differences in chloride and sulfate concentrations, demonstrates that some regions contain mixtures of multiple fluids moving through the complex hydrological system in the subsurface. The mixing of these fluids may account for increased levels of DOC in meteoric sulfate-dominant acidic regions. Ultimately, foundational values of dissolved carbon and their isotopic compositions are provided in a parkwide study, so results can be combined with future studies that apply different sequencing analyses to understand the specific biogeochemical cycling and microbial communities of individual hot springs.
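The δ13C values quoted above use standard delta notation relative to the VPDB reference; the defining relation is sketched below for readers unfamiliar with the convention.

```latex
% Delta notation for carbon isotope ratios, reported in per mil (permil);
% R denotes the 13C/12C ratio and VPDB is the conventional reference standard.
\delta^{13}\mathrm{C} \;=\;
  \left( \frac{R_{\mathrm{sample}}}{R_{\mathrm{VPDB}}} - 1 \right) \times 1000
```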
Contributors: Barnes, Tanner (Author) / Shock, Everett (Thesis advisor) / Meyer-Dombard, D'Arcy (Committee member) / Hartnett, Hilairy (Committee member) / Arizona State University (Publisher)
Created: 2023
Description
As air quality standards become more stringent to combat poor air quality, there is a greater need for more effective pollutant control measures and increased air monitoring network coverage. Polluted air, in the form of aerosols and gases, can impact respiratory and cardiovascular health, visibility, the climate, and material weathering. This work demonstrates how traditional networks can be used to study generational events, how these networks can be supplemented with low-cost sensors, and the effectiveness of several control measures. First, an existing network was used to study the effect of COVID-19 travel restrictions on air quality in Maricopa County, Arizona, which would not have been possible without the historical record that a traditional network provides. Although this study determined that decreases in CO and NO2 were not unique to the travel restrictions, it was limited to only three locations because of network sparseness. The second part of this work expanded the traditional NO2 monitoring network using low-cost sensors that were first collocated with a reference monitor to evaluate their performance and establish a robust calibration. The sensors were then deployed to the field with varying results; their calibration was further improved by cycling the sensors between deployment and reference locations throughout the summer. These calibrated NO2 data, along with volatile organic compound data, were combined to enhance the understanding of ozone formation in Maricopa County, especially during wildfire season. In addition to being in non-attainment for ozone standards, Maricopa County fails to meet standards for particulate matter smaller than 10 μm (PM10). A large portion of PM10 emissions is attributed to fugitive dust that is either windblown or kicked up by vehicles. The third part of this work demonstrated that Enzyme Induced Carbonate Precipitation (EICP) treatments aggregate soil particles and prevent fugitive dust emissions. The final part of the work examined tire wear PM10 emissions, as vehicles are another significant contributor to PM10. Observations showed a decrease in tire wear PM10 during winter, with little change when varying the highway surface type.
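As a rough illustration of the collocation step described above, the sketch below fits a low-cost sensor's raw NO2 readings against a collocated reference monitor and applies the resulting linear calibration to field data. The function names and numbers are hypothetical; the actual calibration in this work may include additional covariates such as temperature or humidity.

```python
# Minimal sketch of collocation calibration for a low-cost NO2 sensor:
# fit the raw sensor signal against a reference monitor, then apply the fit
# to field readings. Illustrative only; not the thesis's actual procedure.
import numpy as np

def fit_calibration(raw_sensor, reference):
    """Least-squares linear fit: reference ~ slope * raw + intercept."""
    slope, intercept = np.polyfit(raw_sensor, reference, deg=1)
    return slope, intercept

def apply_calibration(raw_sensor, slope, intercept):
    return slope * np.asarray(raw_sensor) + intercept

# Made-up collocation data (ppb)
raw = np.array([12.0, 18.5, 25.1, 30.2, 41.7])
ref = np.array([10.0, 16.0, 22.0, 27.5, 38.0])
m, b = fit_calibration(raw, ref)
print(apply_calibration([20.0, 35.0], m, b))  # calibrated field readings
```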
Contributors: Miech, Jason Andrew (Author) / Herckes, Pierre (Thesis advisor) / Fraser, Matthew P (Committee member) / Shock, Everett (Committee member) / Arizona State University (Publisher)
Created: 2023
Description
The prevalence and unique properties of airborne nanoparticles have raised concerns regarding their potential adverse health effects. Despite their significance, the understanding of nanoparticle generation, transport, and exposure remains incomplete. This study first aimed to assess nanoparticle exposure in indoor workplace environments in the semiconductor manufacturing industry. On-site observations during tool preventive maintenance revealed a significant release of particles smaller than 30 nm, which subsequent instrumental analysis confirmed as predominantly composed of transition metals. Although the measured mass concentration levels did not exceed current federal limits, the observations prompted concerns regarding how well filter-based air sampling methods would capture the particles for exposure assessment and how well common personal protective equipment would protect against exposure. To address these concerns, this study evaluated the capture efficiency of filters and masks. When challenged by aerosolized engineered nanomaterials, common filters used in industrial hygiene sampling exhibited capture efficiencies of over 60%. Filtering facepiece respirators, such as the N95 mask, exhibited a capture efficiency of over 98%. In contrast, simple surgical masks showed a capture efficiency of approximately 70%. The experiments showed that face velocity and ambient humidity influence capture performance and, most notably, identified the critical role of mask and particle surface charge in capturing nanoparticles. Masks with higher surface potential exhibited higher capture efficiency toward nanoparticles; eliminating their surface charge significantly diminished the capture efficiency, by up to 43%. Finally, this study characterized outdoor nanoparticle concentrations in the Phoenix metropolitan area, revealing typical concentrations on the order of 10^4 particles/cm³, consistent with other urban environments. During dust storms in the North American monsoon season, when number concentrations of large particles (particularly in the 1-10 μm size range) were elevated, the number concentration of nanoparticles in the 30-100 nm size range was substantially lower, by approximately 55%. These findings provide valuable insights for future assessments of nanoparticle exposure risks and of the filter capture mechanisms associated with airborne nanoparticles.
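The capture efficiencies reported above are typically computed from particle concentrations measured upstream and downstream of the filter or mask; a minimal sketch with illustrative numbers follows.

```python
# Capture efficiency as commonly defined for filters and masks: the fraction
# of challenge particles removed, from upstream and downstream concentrations.
# Numbers are illustrative only, not measurements from this study.
def capture_efficiency(upstream_conc, downstream_conc):
    """Fractional capture efficiency, 0 (none captured) to 1 (all captured)."""
    return 1.0 - downstream_conc / upstream_conc

# e.g., an N95-like case: 10,000 #/cm^3 upstream, 150 #/cm^3 downstream
print(f"{capture_efficiency(10_000, 150):.1%}")  # -> 98.5%
```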
Contributors: Zhang, Zhaobo (Author) / Herckes, Pierre (Thesis advisor) / Westerhoff, Paul (Committee member) / Shock, Everett (Committee member) / Fraser, Matthew (Committee member) / Arizona State University (Publisher)
Created: 2023
Description
Communicating with computers through thought has become a remarkable achievement in recent years, made possible by the use of electroencephalography (EEG). Brain-computer interfaces (BCI) rely heavily on EEG signals for communication between humans and computers. With the advent of deep learning, many recent studies have applied these techniques to EEG data to perform tasks such as emotion recognition, motor imagery classification, sleep analysis, and many more. Despite the rise of interest in EEG signal classification, very few studies have explored the MindBigData dataset, which collects EEG signals recorded while a subject sees a digit and thinks about it. This dataset takes us closer to realizing the idea of mind-reading, or communication via thought. Classifying these signals into the respective digit the user is thinking about is thus a challenging task, which serves as a motivation to study this dataset and apply existing deep learning techniques to it. Given the recent success of the transformer architecture in domains such as computer vision and natural language processing, this thesis studies the transformer architecture for EEG signal classification and also explores other deep learning techniques. As a result, the proposed classification pipeline achieves performance comparable to existing methods.
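A minimal sketch of a transformer encoder applied to EEG trials is shown below to make the setup concrete. It is not the thesis's actual architecture; the channel count, sequence length, and use of ten digit classes are assumptions for illustration, and positional encoding is omitted for brevity.

```python
# Minimal transformer-encoder classifier for EEG trials (illustrative sketch,
# not the thesis's configuration). Input shape: (batch, time, channels).
import torch
import torch.nn as nn

class EEGTransformerClassifier(nn.Module):
    def __init__(self, n_channels=14, d_model=64, n_heads=4, n_layers=2, n_classes=10):
        super().__init__()
        self.input_proj = nn.Linear(n_channels, d_model)   # embed each time step
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=n_layers)
        self.classifier = nn.Linear(d_model, n_classes)

    def forward(self, x):                      # x: (batch, time, channels)
        h = self.encoder(self.input_proj(x))   # self-attention over time steps
        return self.classifier(h.mean(dim=1))  # mean-pool over time, then classify

# e.g., a batch of 8 trials, 256 time samples, 14 electrodes (assumed values)
logits = EEGTransformerClassifier()(torch.randn(8, 256, 14))
print(logits.shape)  # torch.Size([8, 10])
```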
Contributors: Muglikar, Omkar Dushyant (Author) / Wang, Yalin (Thesis advisor) / Liang, Jianming (Committee member) / Venkateswara, Hemanth (Committee member) / Arizona State University (Publisher)
Created: 2021
Description
Statistical Shape Modeling is widely used to study the morphometrics of deformable objects in computer vision and biomedical studies. There are two main viewpoints from which to understand shapes. On one hand, the outer surface of the shape can be taken as a two-dimensional embedding in space. On the other hand, the outer surface along with its enclosed internal volume can be taken as a three-dimensional embedding of interest. Most studies focus on the surface-based perspective by leveraging intrinsic features on the tangent plane. But a two-dimensional model may fail to fully represent the realistic properties of shapes with both intrinsic and extrinsic properties. In this thesis, several Stochastic Partial Differential Equations (SPDEs) are thoroughly investigated, and several methods derived from these SPDEs are proposed to address both two-dimensional and three-dimensional shape analysis. The unique physical meanings of these SPDEs inspired the features, shape descriptors, metrics, and kernels developed in this series of works. Initially, the generation of high-dimensional shape data, here tetrahedral meshes, is introduced. The cerebral cortex is taken as the study target, and an automatic pipeline for generating the gray matter tetrahedral mesh is introduced. Then, a discretized Laplace-Beltrami operator (LBO) and a Hamiltonian operator (HO) in the tetrahedral domain are derived with the Finite Element Method (FEM). Two high-dimensional shape descriptors are defined based on the solutions of the heat equation and Schrödinger’s equation. Considering that high-dimensional shape models usually contain massive redundancy, and that many applications demand effective landmarks, Gaussian process landmarking on tetrahedral meshes is further studied. A SIWKS-based metric space is used to define a geometry-aware Gaussian process. The study of the periodic potential diffusion process further inspired the idea of a new kernel called the geometry-aware convolutional kernel. A series of Bayesian learning methods are then introduced to tackle the problems of shape retrieval and classification. Experiments for each component are demonstrated. From popular SPDEs such as the heat equation and Schrödinger’s equation to the general potential diffusion equation and the specific periodic potential diffusion equation, it is clear that classical SPDEs play an important role in discovering new features, metrics, shape descriptors, and kernels. I hope this thesis can serve as an example of using interdisciplinary knowledge to solve problems.
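To make the heat-equation-based descriptor construction concrete, the sketch below computes a generic heat kernel signature from the eigenpairs of a discretized LBO, assuming the FEM stiffness matrix L and mass matrix M for the tetrahedral mesh have already been assembled. This is a standard illustration, not the exact formulation used in the thesis.

```python
# Generic heat kernel signature from a discretized Laplace-Beltrami operator.
# Assumes sparse FEM stiffness (L) and mass (M) matrices of size
# n_vertices x n_vertices have already been assembled for the mesh.
import numpy as np
from scipy.sparse.linalg import eigsh

def heat_kernel_signature(L, M, times, k=100):
    """HKS(x, t) = sum_i exp(-lambda_i * t) * phi_i(x)^2 over the k smallest
    eigenpairs of the generalized problem L phi = lambda M phi."""
    # Small negative shift avoids a singular factorization at the zero eigenvalue.
    eigvals, eigvecs = eigsh(L, k=k, M=M, sigma=-1e-8, which="LM")
    hks = np.stack(
        [(np.exp(-eigvals * t) * eigvecs**2).sum(axis=1) for t in times],
        axis=1)
    return hks  # shape: (n_vertices, len(times))
```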
Contributors: Fan, Yonghui (Author) / Wang, Yalin (Thesis advisor) / Lepore, Natasha (Committee member) / Turaga, Pavan (Committee member) / Yang, Yezhou (Committee member) / Arizona State University (Publisher)
Created: 2021
Description
Graph matching is a fundamental but notoriously difficult problem due to its NP-hard nature, and it serves as a cornerstone for a series of applications in machine learning and computer vision, such as image matching, dynamic routing, and drug design, to name a few. Although there has been extensive previous investigation into high-performance graph matching solvers, it remains a challenging task to tackle the matching problem in real-world scenarios with severe graph uncertainty (e.g., noise, outliers, misleading or ambiguous links). In this dissertation, the main focus is to investigate the essence of, and propose solutions for, graph matching with higher reliability under such uncertainty. To this end, the proposed research was conducted from three perspectives related to reliable graph matching: modeling, optimization, and learning. For modeling, graph matching is extended from the typical quadratic assignment problem to a more generic mathematical model by introducing a specific family of separable functions, achieving higher capacity and reliability. In terms of optimization, a novel, gradient-efficient, determinant-based regularization technique is proposed in this research, showing high robustness against outliers. Then the learning paradigm for graph matching under intrinsic combinatorial characteristics is explored. First, a study is conducted on how to fill the gap between the discrete problem and its continuous approximation under a deep learning framework. The dissertation then investigates the need for a more reliable latent topology of graphs for matching and proposes an effective and flexible framework to obtain it. The findings of this dissertation include theoretical studies and several novel algorithms, with rich experiments demonstrating their effectiveness.
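For reference, the classic quadratic assignment formulation that this work generalizes is commonly written in Lawler's form, sketched below; K is the affinity matrix encoding node- and edge-level similarities between the two graphs, and X is the assignment (permutation) matrix.

```latex
% Lawler's quadratic assignment form of graph matching (the baseline model):
% maximize the total affinity of matched node pairs and edge pairs subject to
% X being a one-to-one assignment.
\max_{X}\; \operatorname{vec}(X)^{\top} K \operatorname{vec}(X)
\quad \text{s.t.}\quad
X \in \{0,1\}^{n \times n},\;\; X\mathbf{1} = \mathbf{1},\;\; X^{\top}\mathbf{1} = \mathbf{1}
```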
Contributors: Yu, Tianshu (Author) / Li, Baoxin (Thesis advisor) / Wang, Yalin (Committee member) / Yang, Yezhou (Committee member) / Yang, Yingzhen (Committee member) / Arizona State University (Publisher)
Created: 2021