Matching Items (133)
Description
The technology expansion seen in the last decade for genomics research has permitted the generation of large-scale data sources pertaining to molecular biological assays, genomics, proteomics, transcriptomics, and other modern omics catalogs. New methods to analyze, integrate, and visualize these data types are essential to unveil relevant disease mechanisms. Toward these objectives, this research focuses on data integration within two scenarios: (1) transcriptomic, proteomic, and functional information and (2) real-time sensor-based measurements motivated by single-cell technology. To assess relationships between protein abundance and transcriptomic and functional data, a nonlinear model was explored at static and temporal levels. The successful integration of these heterogeneous data sources through the stochastic gradient boosted tree approach and its improved predictability are some highlights of this work. Through the development of an innovative validation subroutine based on a permutation approach and the use of external information (i.e., operons), the lack of a priori knowledge for undetected proteins was overcome. The integrative methodologies allowed for the identification of undetected proteins in Desulfovibrio vulgaris and Shewanella oneidensis for further biological exploration in laboratories toward finding functional relationships. In an effort to better understand diseases such as cancer at different developmental stages, the Microscale Life Science Center, headquartered at Arizona State University, is pursuing single-cell studies by developing novel technologies. This research assembled and applied a statistical framework that tackled the following challenges: random noise, heterogeneous dynamic systems with multiple states, and understanding cell behavior within and across different Barrett's esophageal epithelial cell lines using oxygen consumption curves.
These curves were characterized with good empirical fit using nonlinear models with simple structures, which allowed extraction of a large number of features. Application of a supervised classification model to these features, together with the integration of experimental factors, allowed identification of subtle patterns among different cell types, visualized through multidimensional scaling. Motivated by the challenges of analyzing real-time measurements, we further explored a unique two-dimensional representation of multiple time series using a wavelet approach, which showed promising results toward less complex approximations. The benefits of external information were also explored to improve the image representation.
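As an illustration of the modeling strategy described above, here is a minimal sketch of fitting stochastic gradient boosted trees and validating the fit against label permutations. The synthetic data, library choice (scikit-learn), and parameter values are my own assumptions, not the thesis's actual pipeline:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

rng = np.random.RandomState(0)
# Hypothetical stand-in data: 10 transcriptomic features predicting a
# protein-abundance response with a nonlinear component.
X = rng.normal(size=(300, 10))
y = X[:, 0] ** 2 + 0.5 * X[:, 1] + rng.normal(scale=0.1, size=300)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# subsample < 1.0 supplies the "stochastic" element of gradient boosting.
model = GradientBoostingRegressor(subsample=0.5, random_state=0)
observed = r2_score(y_te, model.fit(X_tr, y_tr).predict(X_te))

# Permutation-style validation: refit on shuffled responses to build a
# null distribution of held-out fit quality.
null_scores = []
for i in range(10):
    m = GradientBoostingRegressor(subsample=0.5, random_state=i)
    m.fit(X_tr, rng.permutation(y_tr))
    null_scores.append(r2_score(y_te, m.predict(X_te)))
```

The real fit should clearly outperform every permuted refit; if it does not, the apparent predictability is an artifact rather than signal.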
ContributorsTorres Garcia, Wandaliz (Author) / Meldrum, Deirdre R. (Thesis advisor) / Runger, George C. (Thesis advisor) / Gel, Esma S. (Committee member) / Li, Jing (Committee member) / Zhang, Weiwen (Committee member) / Arizona State University (Publisher)
Created2011
Description
Hepatocellular carcinoma (HCC) is a malignant tumor and the seventh most common cancer in humans. Every year there is a significant rise in the number of patients suffering from HCC. Most clinical research has focused on early detection of HCC, which greatly improves a patient's chances of survival. Emerging advancements in functional and structural imaging techniques have provided the ability to detect microscopic changes in the tumor microenvironment and microstructure. The prime focus of this thesis is to validate the applicability of an advanced imaging modality, Magnetic Resonance Elastography (MRE), for HCC diagnosis. The research was carried out on data from three HCC patients, and three sets of experiments were conducted. The main focus was on the quantitative aspects of MRE in conjunction with texture analysis, an advanced image-processing pipeline, and multivariate machine learning methods for accurate HCC diagnosis. We analyzed techniques for handling unbalanced data and evaluated the efficacy of sampling techniques. Along with this, we studied different machine learning algorithms and developed models using them. Performance metrics such as prediction accuracy, sensitivity, and specificity were used to evaluate the final model. We were able to identify the significant features in the dataset, and the selected classifier was robust in predicting the response class variable with high accuracy.
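The evaluation metrics named above can be sketched directly from a confusion matrix; the labels below are illustrative, not the thesis's patient data:

```python
import numpy as np

# Hypothetical predictions from an HCC classifier (1 = HCC, 0 = normal).
y_true = np.array([1, 1, 1, 0, 0, 0, 0, 0, 1, 0])
y_pred = np.array([1, 0, 1, 0, 0, 1, 0, 0, 1, 0])

tp = int(np.sum((y_true == 1) & (y_pred == 1)))  # true positives
tn = int(np.sum((y_true == 0) & (y_pred == 0)))  # true negatives
fp = int(np.sum((y_true == 0) & (y_pred == 1)))  # false positives
fn = int(np.sum((y_true == 1) & (y_pred == 0)))  # false negatives

accuracy = (tp + tn) / len(y_true)   # overall prediction accuracy
sensitivity = tp / (tp + fn)         # true-positive rate on HCC cases
specificity = tn / (tn + fp)         # true-negative rate on normals
```

Sensitivity and specificity matter here because, with unbalanced classes, accuracy alone can look high even while most of the minority (disease) class is missed.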
ContributorsBansal, Gaurav (Author) / Wu, Teresa (Thesis advisor) / Mitchell, Ross (Thesis advisor) / Li, Jing (Committee member) / Arizona State University (Publisher)
Created2013
Description
In this thesis, we present the study of several physical properties of relativistic matter under extreme conditions. We start by deriving the rate of the nonleptonic weak processes and the bulk viscosity in several spin-one color superconducting phases of quark matter. We also calculate the bulk viscosity in the nonlinear and anharmonic regime in the normal phase of strange quark matter. We point out several qualitative effects due to the anharmonicity, although quantitatively they appear to be relatively small. In the corresponding study, we take into account the interplay between the nonleptonic and semileptonic weak processes. The results can be important in order to relate accessible observables of compact stars to their internal composition. We also use quantum field theoretical methods to study the transport properties in monolayer graphene in a strong magnetic field. The corresponding quasi-relativistic system reveals an anomalous quantum Hall effect, whose features are directly connected with the spontaneous flavor symmetry breaking. We study the microscopic origin of Faraday rotation and magneto-optical transmission in graphene and show that their main features are in agreement with the experimental data.
ContributorsWang, Xinyang, Ph.D (Author) / Shovkovy, Igor (Thesis advisor) / Belitsky, Andrei (Committee member) / Easson, Damien (Committee member) / Peng, Xihong (Committee member) / Vachaspati, Tanmay (Committee member) / Arizona State University (Publisher)
Created2013
Description
A P-value based method is proposed for statistical monitoring of various types of profiles in Phase II. The performance of the proposed method is evaluated by the average run length criterion under various shifts in the intercept, slope, and error standard deviation of the model. In our proposed approach, P-values are computed at each level within a sample. If at least one of the P-values is less than a pre-specified significance level, the chart signals out of control. The primary advantage of our approach is that only one control chart is required to monitor several parameters simultaneously: the intercept, slope(s), and the error standard deviation. A comprehensive comparison of the proposed method and the existing KMW-Shewhart method for monitoring linear profiles is conducted. In addition, the effect that the number of observations within a sample has on the performance of the proposed method is investigated. The proposed method is also compared to the T^2 method of Kang and Albin (2000) for multivariate, polynomial, and nonlinear profiles. A simulation study shows that, overall, the proposed P-value method performs satisfactorily for different profile types.
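A minimal sketch of the signaling rule described above, for a linear profile with known in-control parameters. The model, parameter values, and significance level here are illustrative assumptions, not the thesis's design:

```python
import numpy as np
from scipy import stats

# Assumed in-control linear profile: y = 3 + 2x with N(0, 1) errors.
b0, b1, sigma, alpha = 3.0, 2.0, 1.0, 0.005
x = np.linspace(0.0, 1.0, 10)

def chart_signals(y):
    """Compute a two-sided P-value at each level within the sample;
    signal out of control if any P-value falls below alpha."""
    z = (y - (b0 + b1 * x)) / sigma
    p_values = 2.0 * stats.norm.sf(np.abs(z))
    return bool(np.any(p_values < alpha))

rng = np.random.RandomState(1)
in_control = b0 + b1 * x + rng.normal(scale=sigma, size=x.size)
shifted = (b0 + 3.0 * sigma) + b1 * x + rng.normal(scale=sigma, size=x.size)
```

With these values, a 3-sigma shift in the intercept pushes at least one level's P-value below the threshold, while the in-control sample does not signal.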
ContributorsAdibi, Azadeh (Author) / Montgomery, Douglas C. (Thesis advisor) / Borror, Connie (Thesis advisor) / Li, Jing (Committee member) / Zhang, Muhong (Committee member) / Arizona State University (Publisher)
Created2013
Description
Numerical simulations are very helpful in understanding the physics of the formation of structure and galaxies. However, it is sometimes difficult to interpret model data with respect to observations, partly due to the difficulties and background noise inherent in observation. The goal here is to bridge this gap between simulation and observation by rendering the model output in image format, which is then processed by tools commonly used in observational astronomy. Images are synthesized in various filters by folding the output of cosmological simulations of gas dynamics with star formation and dark matter with the Bruzual-Charlot stellar population synthesis models. A variation of the Virgo-Gadget numerical simulation code is used with the hybrid gas and stellar formation models of Springel and Hernquist (2003). Outputs taken at various redshifts are stacked to create a synthetic view of the simulated star clusters. Source Extractor (SExtractor) is used to find groupings of stellar populations, which are considered galaxies or galaxy building blocks, and photometry is used to estimate the rest-frame luminosities and distribution functions. With further refinements, this is expected to provide support for missions such as JWST, as well as to probe what additional physics is needed to model the data. The results show good agreement in many respects with observed properties of the galaxy luminosity function (LF) over a wide range of high redshifts. In particular, the slope (alpha) of a fit to the standard Schechter function shows excellent agreement, both in value and in evolution with redshift, when compared with observation. Discrepancies with observation in other properties are seen to result from limitations of the simulation and from additional feedback mechanisms that are needed.
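The Schechter fit mentioned above can be sketched as a plain function; the parameter values here are illustrative defaults, not the thesis's fitted values:

```python
import numpy as np

def schechter(L, phi_star=1.0, L_star=1.0, alpha=-1.6):
    """Schechter luminosity function phi(L) = (phi*/L*) (L/L*)^alpha
    exp(-L/L*): a power law at the faint end with an exponential
    cutoff above the characteristic luminosity L*."""
    x = L / L_star
    return (phi_star / L_star) * x ** alpha * np.exp(-x)

# At L = L*, the power-law factor is 1 and only the cutoff remains.
at_knee = schechter(1.0)
# Far below L*, the faint-end slope alpha dominates the shape.
faint_ratio = schechter(1e-3) / schechter(1e-4)
```

Fitting alpha to the synthetic photometry, as the abstract describes, amounts to adjusting these three parameters until the function matches the binned luminosity counts.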
ContributorsMorgan, Robert (Author) / Windhorst, Rogier A (Thesis advisor) / Scannapieco, Evan (Committee member) / Rhoads, James (Committee member) / Gardner, Carl (Committee member) / Belitsky, Andrei (Committee member) / Arizona State University (Publisher)
Created2012
Description
Understanding the temperature structure of protoplanetary disks (PPDs) is paramount to modeling disk evolution and future planet formation. PPDs around T Tauri stars have two primary heating sources: protostellar irradiation, which depends on the flaring of the disk, and accretional heating, as viscous coupling between annuli dissipates energy. I have written a "1.5-D" radiative transfer code to calculate disk temperatures assuming hydrostatic and radiative equilibrium. The model solves for the temperature at all locations simultaneously using Rybicki's method, converges rapidly at high optical depth, and retains full frequency dependence. The likely cause of accretional heating in PPDs is the magnetorotational instability (MRI), which acts where gas ionization is sufficiently high for the gas to couple to the magnetic field. This occurs in the surface layers of the disk, leaving the interior portions of the disk inactive (the "dead zone"). I calculate temperatures in PPDs undergoing such "layered accretion." Since the accretional heating is concentrated far from the midplane, temperatures in the disk's interior are lower than in PPDs modeled with vertically uniform accretion. This is the first time the method has been used to study disks evolving via MRI-driven accretion confined to surface layers. I find that temperatures in layered accretion disks do not differ significantly from those of "passive disks," where no accretional heating exists. Emergent spectra are insensitive to active layer thickness, making it difficult to observationally distinguish disks undergoing layered versus uniform accretion. I also calculate the ionization chemistry in PPDs, using an ionization network that includes multiple charge states of dust grains. Combined with a criterion for the onset of the MRI, I calculate where the MRI can be initiated and the extent of dead zones in PPDs.
After accounting for feedback between temperature and active layer thickness, I find the surface density of the actively accreting layers falls rapidly with distance from the protostar, leading to a net outward flow of mass from ~0.1 to 3 AU. The clearing out of the innermost zones is possibly consistent with the observed behavior of recently discovered "transition disks."
ContributorsLesniak, Michael V., III (Author) / Desch, Steven J. (Thesis advisor) / Scannapieco, Evan (Committee member) / Timmes, Francis (Committee member) / Starrfield, Sumner (Committee member) / Belitsky, Andrei (Committee member) / Arizona State University (Publisher)
Created2012
Description
This thesis deals with the first measurements done with a cold neutron beam at the Spallation Neutron Source at Oak Ridge National Laboratory. The experimental technique consisted of capturing polarized cold neutrons on nuclei to measure parity violation in the angular distribution of the gamma rays following neutron capture. The measurements presented here for chlorine (³⁵Cl) and aluminum (²⁷Al) are part of a program whose ultimate goal is to measure the asymmetry in the angular distribution of gamma rays emitted in the capture of neutrons on protons, with a precision better than 10⁻⁸, in order to extract the weak hadronic coupling constant hπ¹ due to the pion-exchange interaction with isospin change equal to one. Based on theoretical calculations, the asymmetry in the angular distribution of the gamma rays from neutron capture on protons has an estimated size of 5·10⁻⁸. This implies that the Al parity-violation asymmetry and its uncertainty have to be known with a precision better than 4·10⁻⁸. The proton target is liquid hydrogen (H₂) contained in an aluminum vessel. Results are presented for parity-violating and parity-conserving asymmetries in chlorine and aluminum. The systematic and statistical uncertainties in the calculation of the parity-violating and parity-conserving asymmetries are discussed.
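For orientation, the measured quantity enters the gamma-ray angular distribution in the standard form for polarized-neutron capture (the notation below is the conventional one for such experiments, not quoted from the thesis):

```latex
\frac{d\omega}{d\Omega} \;\propto\; \frac{1}{4\pi}\,\bigl(1 + A_\gamma \cos\theta\bigr)
```

where θ is the angle between the neutron spin direction and the emitted gamma momentum. For neutron capture on protons, the parity-violating asymmetry A_γ is, to leading order, proportional to the coupling hπ¹, which is why an asymmetry measurement at the 10⁻⁸ level constrains it.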
ContributorsBalascuta, Septimiu (Author) / Alarcon, Ricardo (Thesis advisor) / Belitsky, Andrei (Committee member) / Doak, Bruce (Committee member) / Comfort, Joseph (Committee member) / Schmidt, Kevin (Committee member) / Arizona State University (Publisher)
Created2012
Description
Rapid advances in sensor and information technology have produced environments rich in both spatial and temporal data, creating a pressing need to develop novel statistical methods and associated computational tools to extract intelligent knowledge and informative patterns from these massive datasets. The statistical challenges posed by these massive datasets lie in their complex structures, such as high dimensionality, hierarchy, multi-modality, heterogeneity, and data uncertainty. Beyond the statistical challenges, the associated computational approaches are also essential for achieving efficiency, effectiveness, and numerical stability in practice. On the other hand, recent developments in statistics and machine learning, such as sparse learning and transfer learning, as well as some traditional methodologies that still hold potential, such as multi-level models, all shed light on addressing these complex datasets in a statistically powerful and computationally efficient way. In this dissertation, we identify four general kinds of complex datasets, namely "high-dimensional datasets", "hierarchically-structured datasets", "multi-modality datasets", and "data uncertainties", which are ubiquitous in many domains, such as biology, medicine, neuroscience, health care delivery, and manufacturing. We describe the development of novel statistical models to analyze complex datasets that fall under these four categories, and we show how these models can be applied to real-world problems such as Alzheimer's disease research, the nursing care process, and manufacturing.
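As one concrete instance of the "sparse learning" theme in the high-dimensional category, here is a minimal lasso sketch. The synthetic data, library choice (scikit-learn), and penalty value are my own assumptions, not the dissertation's models:

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.RandomState(0)
# Hypothetical p >> n setting: 50 samples, 200 features, of which only
# the first three carry signal.
X = rng.normal(size=(50, 200))
beta = np.zeros(200)
beta[:3] = [2.0, -1.5, 1.0]
y = X @ beta + rng.normal(scale=0.1, size=50)

# The L1 penalty drives most coefficients exactly to zero, performing
# variable selection and estimation simultaneously.
model = Lasso(alpha=0.1, max_iter=10000).fit(X, y)
selected = np.flatnonzero(model.coef_)
```

The payoff is interpretability: despite having four times more features than samples, the surviving coefficients identify the informative features.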
ContributorsHuang, Shuai (Author) / Li, Jing (Thesis advisor) / Askin, Ronald (Committee member) / Ye, Jieping (Committee member) / Runger, George C. (Committee member) / Arizona State University (Publisher)
Created2012
Description
Some of the most talented, innovative, and experimental artists are students, but they are often discouraged by the price of higher education and lack of scholarship or funding opportunities. Additionally, the art industry has become stagnant. Traditional brick-and-mortar galleries are not willing to represent young, unknown artists. Their overhead is simply too high for risky choices.
The Student Art Project is art patronage for the 21st century: a curated online gallery featuring exceptional student artists. It offers buyers a highly curated experience. Only five artists are featured each month, so buyers are not bombarded with thousands of different products and separate artists' "shops". They can read artists' bios and find art they connect with.
Student artists apply through an online form. Once accepted to the program, artists receive a $200 materials stipend to create an exclusive collection of 5-10 pieces. Original artwork and limited edition prints are sold through our website. These collections can potentially fund an entire year of college tuition, a life-changing amount for many students.
Brick-and-mortar galleries typically take 40-60% of the retail price of artwork. The Student Art Project will take only 30%, which we will use to reinvest in future artists. Other art websites, like Etsy, require artists to ship, invoice, and communicate with customers. For students, this means less time spent in the classroom and less time developing their craft. The Student Art Project handles all business functions for our artists, allowing them to concentrate on what really matters: their education.
ContributorsDangler, Rebecca Leigh (Author) / Trujillo, Rhett (Thesis director) / Coleman, Sean (Committee member) / Barrett, The Honors College (Contributor) / Herberger Institute for Design and the Arts (Contributor) / Department of Management (Contributor)
Created2015-05
Description
Drought is one of the most pressing issues affecting the future of the standard of living here in Phoenix. With the threat of water rationing and steep price hikes looming on the horizon for water customers in California, the desert southwest, and in drought-stricken communities worldwide, industrial designers are in a prime position to help improve the experience of water conservation so that consumers are willing to start taking conscious steps toward rethinking their relationship with water usage.
In a research group, several designers sought to understand the depth and complexity of this highly politicized issue by interviewing a wide variety of stakeholders, including sustainability experts, landscapers, water company executives, small business owners, reservoir forest rangers, and many more. Data synthesis led to the conclusion that residential water use is a lifestyle issue: lawns, pools, and overwatered landscaping contribute 70% of all residential water use in the Phoenix area, so the only real way to conserve involves a significant shift in the collective idea of an "ideal" home, toward greater population density and communal green spaces.
DR. DISH is a dishwashing device meant to fit into the high-density living spaces that are rapidly being built in the face of the massive exodus of people into the world's cities. To help busy apartment and condominium dwellers conserve water and time, DR. DISH converts a standard kitchen sink into a small dishwasher, which uses significantly less water than hand-washing dishes or rinsing dishes before putting them into a conventional dishwasher. Using advanced filtration technology and a powerful rinse cycle, a load of dishes can be cleaned with about 2 gallons of water. Fully automating the dishwashing process also saves the user time and minimizes unpleasant contact with food residue and grease.
This device is meant to have a significant impact on the water use of households that do not have a dishwasher, or simply do not use one. With a low target price point and myriad convenient features, DR. DISH is a high-tech solution that promises water savings at a time when every effort toward conservation is critical. As we move toward a new era of determining water rights and imposing mandatory restrictions on everyone living in affected areas, creating conservation solutions relevant to the lifestyles of the future is especially important. The agility of designers in creating products that quickly cut consumer water consumption will be a key factor in determining whether humanity can adapt to a new era in our relationship with natural resources.
ContributorsMarcinkowski, Margaret Nicole (Author) / Shin, Dosun (Thesis director) / McDermott, Lauren (Committee member) / Barrett, The Honors College (Contributor) / The Design School (Contributor) / Herberger Institute for Design and the Arts (Contributor)
Created2015-05