Description
In an effort to begin validating the large number of discovered candidate biomarkers, proteomics is beginning to shift from shotgun proteomic experiments towards targeted proteomic approaches that provide solutions to automation and economic concerns. Such approaches to validate biomarkers necessitate the mass spectrometric analysis of hundreds to thousands of human samples. As this takes place, a serendipitous opportunity has become evident. As one narrows the focus towards "single" protein targets (instead of entire proteomes) using pan-antibody-based enrichment techniques, a discovery science, so to speak, has emerged. This is due to the largely unknown context in which "single" proteins exist in blood (i.e. polymorphisms, transcript variants, and posttranslational modifications), and hence targeted proteomics has applications even for established biomarkers. Furthermore, besides protein heterogeneity accounting for interferences with conventional immunometric platforms, it is becoming evident that this formerly hidden dimension of structural information also contains rich pathobiological information. Consequently, targeted proteomics studies that aim to ascertain a protein's genuine presentation within disease-stratified populations and serve as a stepping-stone within a biomarker translational pipeline are of clinical interest. Roughly 128 million Americans are pre-diabetic, diabetic, and/or have kidney disease, and public and private spending for treating these diseases runs to hundreds of billions of dollars. In an effort to create new solutions for the early detection and management of these conditions, described herein is the design, development, and translation of mass spectrometric immunoassays targeted towards diabetes and kidney disease. Population proteomics experiments were performed for the following clinically relevant proteins: insulin, C-peptide, RANTES, and parathyroid hormone. At least thirty-eight protein isoforms were detected.
Besides the numerous disease correlations confronted within the disease-stratified cohorts, certain isoforms also appeared to be causally related to the underlying pathophysiology and/or have therapeutic implications. Technical advancements include multiplexed isoform quantification as well as a "dual-extraction" methodology for eliminating non-specific proteins while simultaneously validating isoforms. Industrial efforts towards widespread clinical adoption are also described. Consequently, this work lays a foundation for the translation of mass spectrometric immunoassays into the clinical arena and simultaneously presents the most recent advancements concerning the mass spectrometric immunoassay approach.
Contributors: Oran, Paul (Author) / Nelson, Randall (Thesis advisor) / Hayes, Mark (Thesis advisor) / Ros, Alexandra (Committee member) / Williams, Peter (Committee member) / Arizona State University (Publisher)
Created: 2011
Description
Signal processing techniques have been used extensively in many engineering problems, and in recent years their application has extended to non-traditional research fields such as biological systems. Many of these applications require extraction of a signal or parameter of interest from degraded measurements. One such application is the mass spectrometric immunoassay (MSIA), which has become one of the primary techniques for biomarker discovery. MSIA analyzes protein molecules as potential biomarkers using time of flight mass spectrometry (TOF-MS). Peak detection in TOF-MS is important for biomarker analysis and many other MS-related applications. Though many peak detection algorithms exist, most of them are based on heuristic models. One way of detecting signal peaks is by deploying stochastic models of the signal and noise observations. The likelihood ratio test (LRT) detector, based on the Neyman-Pearson (NP) lemma, is a uniformly most powerful test for decision making in the form of a hypothesis test. The primary goal of this dissertation is to develop signal and noise models for electrospray ionization (ESI) TOF-MS data. A new method is proposed for developing the signal model by employing first-principles calculations based on device physics and molecular properties. The noise model is developed by analyzing MS data from careful experiments in the ESI mass spectrometer. A non-flat baseline in MS data is common, and the reasons behind its formation have not been fully understood. A new signal model explaining the presence of the baseline is proposed, though detailed experiments are needed to further substantiate the model assumptions. Signal detection schemes based on these signal and noise models are proposed. A maximum likelihood (ML) method is introduced for estimating the signal peak amplitudes. The performance of the detection methods and ML estimation is evaluated with Monte Carlo simulation, which shows promising results.
An application of these methods is proposed for fractional abundance calculation in biomarker analysis, which is mathematically robust and fundamentally different from current algorithms. Biomarker panels for type 2 diabetes and cardiovascular disease are analyzed using existing MS analysis algorithms. Finally, a support vector machine based multi-classification algorithm is developed for evaluating the biomarkers' effectiveness in discriminating type 2 diabetes and cardiovascular disease, and is shown to perform better than a linear discriminant analysis based classifier.
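The LRT detection and ML amplitude estimation described above can be sketched under simplifying assumptions. In the toy below the noise is additive white Gaussian and the peak shape is a known template, in which case the Neyman-Pearson LRT reduces to a matched-filter statistic compared against a threshold; the function names and models are illustrative, not the dissertation's actual signal and noise models:

```python
import numpy as np

def lrt_peak_detector(x, template, noise_var, threshold):
    """Matched-filter form of the Neyman-Pearson LRT for a known peak
    shape in additive white Gaussian noise.  Under H0 a window is pure
    noise; under H1 it contains a scaled copy of the template.  For
    Gaussian noise the log-likelihood ratio is monotone in the
    correlation statistic, so we threshold that statistic directly."""
    s = template / np.linalg.norm(template)            # unit-energy template
    n, m = len(x), len(s)
    stats = np.array([x[i:i + m] @ s for i in range(n - m + 1)])
    detections = stats / np.sqrt(noise_var) > threshold
    return stats, detections

def ml_amplitude(window, template):
    """Maximum-likelihood peak amplitude under the same Gaussian model:
    the least-squares projection of the data onto the template."""
    return (window @ template) / (template @ template)
```

The threshold would in practice be set from the noise model for a desired false-alarm rate, per the NP lemma.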
Contributors: Buddi, Sai (Author) / Taylor, Thomas (Thesis advisor) / Cochran, Douglas (Thesis advisor) / Nelson, Randall (Committee member) / Duman, Tolga (Committee member) / Arizona State University (Publisher)
Created: 2012
Description
Cancer claims hundreds of thousands of lives every year in the US alone. Finding ways for early detection of cancer onset is crucial for better management and treatment of cancer. Thus, biomarkers, especially protein biomarkers, which as functional units reflect dynamic physiological changes, need to be discovered. Though important, only a few protein cancer biomarkers have been approved to date. To accelerate this process, fast, comprehensive, and affordable assays are required that can be applied to large population studies. For this, these assays should be able to comprehensively characterize and explore the molecular diversity of nominally "single" proteins across populations. This information is usually unavailable with commonly used immunoassays such as ELISA (enzyme linked immunosorbent assay), which either ignore protein microheterogeneity or are confounded by it. To this end, mass spectrometric immunoassays (MSIA) for three different human plasma proteins have been developed. These proteins, viz. IGF-1, hemopexin, and tetranectin, have been reported in the literature to correlate with many diseases as well as several carcinomas. The developed assays were used to extract the entire proteins from plasma samples, which were subsequently analyzed on mass spectrometric platforms. Matrix assisted laser desorption ionization (MALDI) and electrospray ionization (ESI) mass spectrometric techniques were used due to their availability and suitability for the analysis. This revealed different structural forms of these proteins, exposing a structural micro-heterogeneity that is invisible to commonly used immunoassays. These assays are fast and comprehensive, and can be applied in large sample studies to analyze proteins for biomarker discovery.
Contributors: Rai, Samita (Author) / Nelson, Randall (Thesis advisor) / Hayes, Mark (Thesis advisor) / Borges, Chad (Committee member) / Ros, Alexandra (Committee member) / Arizona State University (Publisher)
Created: 2012
Description
This thesis shows analyses of mixing and transport patterns associated with Hurricane Katrina as it hit the United States in August of 2005. Specifically, by applying atmospheric velocity information from the Weather Research and Forecasting System, finite-time Lyapunov exponents have been computed and the Lagrangian Coherent Structures have been identified. The chaotic dynamics of material transport induced by the hurricane result from these structures within the flow. Boundaries of the coherent structures are highlighted by the FTLE field, and individual particle transport within the hurricane is affected by the location of these boundaries. In addition to idealized fluid particles, we also studied inertial particles, which have finite size and inertia. Based on the established Maxey-Riley equations for the dynamics of finite-size particles, we obtain a reduced equation governing particle position. Using methods derived from computer graphics, we identify maximizers of the FTLE field. Following and applying these ideas, we analyze the dynamics of inertial particle transport within Hurricane Katrina, through comparison of trajectories of different-sized particles and by pinpointing the location of the Lagrangian Coherent Structures.
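The FTLE computation at the heart of this kind of analysis (advect a grid of particles, differentiate the resulting flow map, take the leading eigenvalue of the Cauchy-Green tensor) can be sketched on a standard analytic test flow. The double-gyre field below is a stand-in for the WRF wind data, which is not available here, and all names are illustrative:

```python
import numpy as np

def double_gyre(t, xy, A=0.1, eps=0.25, om=2 * np.pi / 10):
    # time-dependent double gyre: a standard test flow in LCS studies
    x, y = xy
    a = eps * np.sin(om * t)
    b = 1 - 2 * a
    f = a * x ** 2 + b * x
    dfdx = 2 * a * x + b
    u = -np.pi * A * np.sin(np.pi * f) * np.cos(np.pi * y)
    v = np.pi * A * np.cos(np.pi * f) * np.sin(np.pi * y) * dfdx
    return np.array([u, v])

def flow_map(xy0, t0, T, steps=100):
    # advect one particle from t0 to t0+T with fixed-step RK4
    h = T / steps
    xy = np.array(xy0, float)
    t = t0
    for _ in range(steps):
        k1 = double_gyre(t, xy)
        k2 = double_gyre(t + h / 2, xy + h / 2 * k1)
        k3 = double_gyre(t + h / 2, xy + h / 2 * k2)
        k4 = double_gyre(t + h, xy + h * k3)
        xy = xy + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        t += h
    return xy

def ftle_field(nx, ny, t0=0.0, T=10.0):
    xs = np.linspace(0, 2, nx)
    ys = np.linspace(0, 1, ny)
    end = np.array([[flow_map((x, y), t0, T) for x in xs] for y in ys])
    ftle = np.zeros((ny, nx))
    for j in range(1, ny - 1):
        for i in range(1, nx - 1):
            # flow-map gradient by central differences on the grid
            dphi = np.array([
                [(end[j, i + 1, 0] - end[j, i - 1, 0]) / (xs[i + 1] - xs[i - 1]),
                 (end[j + 1, i, 0] - end[j - 1, i, 0]) / (ys[j + 1] - ys[j - 1])],
                [(end[j, i + 1, 1] - end[j, i - 1, 1]) / (xs[i + 1] - xs[i - 1]),
                 (end[j + 1, i, 1] - end[j - 1, i, 1]) / (ys[j + 1] - ys[j - 1])]])
            C = dphi.T @ dphi                     # Cauchy-Green strain tensor
            lam = np.linalg.eigvalsh(C)[-1]       # largest eigenvalue
            ftle[j, i] = np.log(lam) / (2 * abs(T))
    return ftle
```

Ridges of the returned field approximate repelling LCS; integrating backward in time (negative T) would highlight attracting structures instead.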
Contributors: Wake, Christian (Author) / Tang, Wenbo (Thesis director) / Moustaoui, Mohamed (Committee member) / Kostelich, Eric (Committee member) / Barrett, The Honors College (Contributor) / College of Liberal Arts and Sciences (Contributor)
Created: 2012-12
Description
The goal of this project was to examine the separatrices that define regions of distinct flow behaviors in realistic time-dependent dynamical systems. In particular, we adapted previously available methods for computing the Finite-Time Lyapunov Exponent (FTLE) to a set of measured wind velocity data in order to visualize the separatrices as ridges of the FTLE field in a section of the atmosphere. This visualization required a number of alterations to the original methods, including interpolation techniques and two different adaptive refinement schemes for producing more detailed results. Overall, two computations were performed with the wind velocity data: one along a single spherical surface, on which the separatrices could be visualized as material lines, and one along a three-dimensional section of the atmosphere, for which the separatrices were material surfaces. The resulting figures provide an image of the Antarctic polar vortex from the wind velocity data, which is consistent with other data gathered on the same date.
Contributors: Upton, James Thomas (Author) / Tang, Wenbo (Thesis director) / Moustaoui, Mohamed (Committee member) / Barrett, The Honors College (Contributor) / School of International Letters and Cultures (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / Department of Physics (Contributor)
Created: 2014-05
Description
Using weather data from the Weather Research and Forecasting model (WRF), we analyze the transport of inertial particles in Hurricane Katrina in order to identify coherent patterns of motion. For our analysis, we choose a Lagrangian approach over an Eulerian approach because the Lagrangian approach is objective and frame-independent and yields better-defined results. In particular, we locate Lagrangian Coherent Structures (LCS), smooth sets of fluid particles that are locally most hyperbolic (either attracting or repelling). We implement a variational method for locating LCS and compare the results to previous computations of LCS using Finite-Time Lyapunov Exponents (FTLE), which identify regions of high stretching in the fluid flow.
Contributors: Deibel, Angelica Rae (Author) / Tang, Wenbo (Thesis director) / Moustaoui, Mohamed (Committee member) / Kostelich, Eric (Committee member) / Barrett, The Honors College (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / Department of Physics (Contributor)
Created: 2013-05
Description
Mathematical epidemiology, one of the oldest and richest areas in mathematical biology, has significantly enhanced our understanding of how pathogens emerge, evolve, and spread. Classical epidemiological models, the standard for predicting and managing the spread of infectious disease, assume that contacts between susceptible and infectious individuals depend on their relative frequency in the population; the behavioral factors that underpin contact rates are not generally addressed. There is, however, an emerging class of models that addresses the feedbacks between infectious disease dynamics and the behavioral decisions driving host contact. Referred to as “economic epidemiology” or “epidemiological economics,” the approach explores the determinants of decisions about the number and type of contacts made by individuals, using insights and methods from economics. We show how the approach has the potential both to improve predictions of the course of infectious disease and to support the development of novel approaches to infectious disease management.
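The behavioral feedback these models capture can be illustrated in miniature with an SIR system whose transmission rate falls as prevalence rises, a crude stand-in for individuals choosing fewer contacts when infection risk is high. The functional form beta(I) = beta0 / (1 + alpha*I) is an illustrative assumption, not a model taken from the article:

```python
import numpy as np

def adaptive_sir(beta0=0.5, gamma=0.1, alpha=20.0, days=300, dt=0.1):
    """SIR model with a prevalence-dependent transmission rate: a toy
    version of the behavioral feedback studied in economic epidemiology.
    alpha controls how strongly contacts respond to prevalence; alpha=0
    recovers the classical frequency-dependent SIR model."""
    S, I, R = 0.99, 0.01, 0.0
    traj = []
    for _ in range(int(days / dt)):
        beta = beta0 / (1 + alpha * I)   # contacts drop when risk is high
        dS = -beta * S * I
        dI = beta * S * I - gamma * I
        S, I, R = S + dt * dS, I + dt * dI, R + dt * (gamma * I)
        traj.append((S, I, R))
    return np.array(traj)
```

Comparing runs with alpha = 0 and alpha > 0 shows the qualitative prediction of these models: behavioral avoidance flattens the epidemic peak relative to the classical model.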
Created: 2015-12-01
Description

Nutrient recycling by fish can be an important part of nutrient cycles in both freshwater and marine ecosystems. As a result, understanding the mechanisms that influence excretion elemental ratios of fish is of great importance to a complete understanding of aquatic nutrient cycles. As fish consume a wide range of diets that differ in elemental composition, stoichiometric theory can inform predictions about dietary effects on excretion ratios.
We conducted a meta-analysis to test the effects of diet elemental composition on consumption and nutrient excretion by fish. We examined the relationship between consumption rate and diet N : P across all laboratory studies and calculated effect sizes for each excretion metric to test for significant effects.
Consumption rate of N, but not of P, was significantly negatively affected by diet N : P. Effect sizes of diet elemental composition on consumption-specific excretion N, P and N : P in laboratory studies were all significantly different from zero, but the effect size for raw excretion N : P was not significantly different from zero in either laboratory studies or field surveys.
Our results highlight the importance of having a mechanistic understanding of the drivers of consumer excretion rates and ratios. We suggest that more research is needed on how consumption and assimilation efficiency vary with N : P and in natural ecosystems in order to further understand mechanistic processes in consumer-driven nutrient recycling.
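The effect-size test described above can be sketched with the log response ratio, a common effect-size metric in ecological meta-analysis; the article does not state which metric was used, so both the metric and the function names below are assumptions for illustration:

```python
import math

def log_response_ratio(mean_t, sd_t, n_t, mean_c, sd_c, n_c):
    """Log response ratio (lnRR) effect size and its sampling variance,
    computed from treatment and control means, SDs, and sample sizes."""
    lnrr = math.log(mean_t / mean_c)
    var = sd_t ** 2 / (n_t * mean_t ** 2) + sd_c ** 2 / (n_c * mean_c ** 2)
    return lnrr, var

def differs_from_zero(lnrr, var, z=1.96):
    # the effect is "significant" if the 95% CI excludes zero
    half = z * math.sqrt(var)
    return not (lnrr - half <= 0.0 <= lnrr + half)
```

In a full meta-analysis the per-study effects would then be pooled under a random-effects model before testing against zero.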

Contributors: Moody, Eric (Author) / Corman, Jessica (Author) / Elser, James (Author) / Sabo, John (Author) / College of Liberal Arts and Sciences (Contributor) / School of Life Sciences (Contributor) / Julie Ann Wrigley Global Institute of Sustainability (Contributor)
Created: 2015-03-01
Description
Preserving a system’s viability in the presence of diversity erosion is critical if the goal is to sustainably support biodiversity. Reduction in population heterogeneity, whether inter- or intraspecies, may increase population fragility, either decreasing its ability to adapt effectively to environmental changes or facilitating the survival and success of ordinarily rare phenotypes. The latter may result in over-representation of individuals who may participate in resource utilization patterns that can lead to over-exploitation, exhaustion, and, ultimately, collapse of both the resource and the population that depends on it. Here, we aim to identify regimes that can signal whether a consumer–resource system is capable of supporting viable degrees of heterogeneity. The framework used here is an expansion of a previously introduced consumer–resource type system of a population of individuals classified by their resource consumption. Application of the Reduction Theorem to the system enables us to evaluate the health of the system through tracking both the mean value of the parameter of resource (over)consumption and the population variance as both change over time. The article concludes with a discussion that highlights the applicability of the proposed framework to the investigation of systems affected by particularly devastating, overly adapted populations, namely cancerous cells. Potential intervention approaches for system management are discussed in the context of cancer therapies.
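A toy discretization of such a parametrically heterogeneous consumer-resource system can illustrate the mean-and-variance bookkeeping that the Reduction Theorem formalizes. The equations below (clonal growth proportional to consumption times resource, logistic resource regrowth) are an illustrative sketch, not the article's model:

```python
import numpy as np

def heterogeneous_consumers(c_values, x0, z0=1.0, r=1.0, K=1.0,
                            d=0.3, dt=0.01, steps=20000):
    """Simulate clones with distributed consumption parameter c feeding
    on a shared logistic resource z, and track the mean and variance of
    c across the population over time.  Selection concentrates the
    population on high-c clones, eroding the variance: the kind of
    heterogeneity loss the article warns about."""
    c = np.asarray(c_values, float)
    x = np.full_like(c, x0)              # clone densities
    z = z0                               # shared resource
    means, variances = [], []
    for _ in range(steps):
        uptake = c * z
        x = x + dt * x * (uptake - d)                       # clone growth
        z = z + dt * (r * z * (1 - z / K) - z * np.sum(c * x))
        z = max(z, 0.0)
        w = x / x.sum()                                     # clone frequencies
        mu = float(w @ c)
        means.append(mu)
        variances.append(float(w @ (c - mu) ** 2))
    return np.array(means), np.array(variances)
```

Running this with an initially uniform spread of c shows the mean consumption parameter drifting upward while the variance collapses, mirroring the over-adaptation scenario discussed above.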
Created: 2015-02-01
Description
Evolving Earth observation and change detection techniques enable the automatic identification of Land Use and Land Cover Change (LULCC) over large extents from massive amounts of remote sensing data. At the same time, this poses a major challenge for the effective organization, representation, and modeling of such information. This study proposes and implements an integrated computational framework to support the modeling and the semantic and spatial reasoning of change information with regard to space, time, and topology. We first propose a conceptual model to formally represent the spatiotemporal variation of change data, which is essential knowledge for supporting various environmental and social studies, such as studies of deforestation and urbanization. A spatial ontology was then created to encode these semantic spatiotemporal data in a machine-understandable format. Based on the knowledge defined in the ontology and related reasoning rules, a semantic platform was developed to support the semantic query and change-trajectory reasoning of areas with LULCC. This semantic platform is innovative in that it integrates semantic and spatial reasoning into a coherent computational and operational software framework to support automated semantic analysis of time series data that can go beyond LULC datasets. In addition, the system scales well as the amount of data increases, as validated by a number of experimental results. This work contributes significantly to both the geospatial Semantic Web and GIScience communities through the establishment of a (web-based) semantic platform for collaborative question answering and decision-making.
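The change-trajectory reasoning such a platform performs can be illustrated in miniature: collapse each parcel's time-ordered land-cover labels into a trajectory and query it. The plain-Python representation below stands in for the ontology and rule engine, and all names are hypothetical:

```python
def change_trajectory(observations):
    """Collapse a time-ordered series of (year, LULC label) observations
    into a change trajectory, dropping consecutive repeats, e.g.
    forest, forest, cropland, urban becomes 'forest > cropland > urban'."""
    traj = []
    for _, label in sorted(observations):
        if not traj or traj[-1] != label:
            traj.append(label)
    return " > ".join(traj)

def parcels_matching(parcel_obs, pattern):
    # select parcels whose derived trajectory contains the queried transition
    return sorted(p for p, obs in parcel_obs.items()
                  if pattern in change_trajectory(obs))
```

In the actual platform the equivalent query would run over ontology triples with spatial predicates; here the substring match is only a stand-in for that reasoning step.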
Created: 2016-10-25