Description
The technology expansion seen in the last decade for genomics research has permitted the generation of large-scale data sources pertaining to molecular biological assays, genomics, proteomics, transcriptomics and other modern omics catalogs. New methods to analyze, integrate and visualize these data types are essential to unveil relevant disease mechanisms. Towards these objectives, this research focuses on data integration within two scenarios: (1) transcriptomic, proteomic and functional information and (2) real-time sensor-based measurements motivated by single-cell technology. To assess relationships between protein abundance, transcriptomic and functional data, a nonlinear model was explored at static and temporal levels. The successful integration of these heterogeneous data sources through the stochastic gradient boosted tree approach and its improved predictability are some highlights of this work. Through the development of an innovative validation subroutine based on a permutation approach and the use of external information (i.e., operons), the lack of a priori knowledge for undetected proteins was overcome. The integrative methodologies allowed for the identification of undetected proteins in Desulfovibrio vulgaris and Shewanella oneidensis for further biological exploration in laboratories towards finding functional relationships. In an effort to better understand diseases such as cancer at different developmental stages, the Microscale Life Science Center headquartered at Arizona State University is pursuing single-cell studies by developing novel technologies. This research developed and applied a statistical framework that tackled the following challenges: random noise, heterogeneous dynamic systems with multiple states, and understanding cell behavior within and across different Barrett's esophageal epithelial cell lines using oxygen consumption curves.
These curves were characterized with good empirical fit using nonlinear models with simple structures, which allowed extraction of a large number of features. Application of a supervised classification model to these features, together with the integration of experimental factors, allowed for the identification of subtle patterns among different cell types, visualized through multidimensional scaling. Motivated by the challenges of analyzing real-time measurements, we further explored a unique two-dimensional representation of multiple time series using a wavelet approach, which showed promising results towards less complex approximations. The benefits of external information were also explored to improve the image representation.
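The curve-characterization step described above can be illustrated with a minimal sketch: fit a simple nonlinear model to a measured curve and use its fitted parameters as extracted features. The exponential-decay form y(t) = a·e^(−kt) below is an illustrative assumption standing in for an oxygen consumption curve, not the dissertation's actual model.

```python
import math

# Illustrative sketch (not the dissertation's code): fit y = a*exp(-k*t)
# to a curve via a log-linear least-squares trick, and treat the fitted
# parameters (a, k) as two features extracted from that curve.

def fit_exp_decay(ts, ys):
    """Fit y = a*exp(-k*t) by regressing log(y) on t; return (a, k)."""
    n = len(ts)
    lys = [math.log(y) for y in ys]
    mean_t = sum(ts) / n
    mean_l = sum(lys) / n
    stt = sum((t - mean_t) ** 2 for t in ts)
    stl = sum((t - mean_t) * (l - mean_l) for t, l in zip(ts, lys))
    slope = stl / stt                     # equals -k
    intercept = mean_l - slope * mean_t   # equals log(a)
    return math.exp(intercept), -slope

# Noiseless synthetic "oxygen consumption" curve with a = 8.0, k = 0.5
ts = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [8.0 * math.exp(-0.5 * t) for t in ts]
a, k = fit_exp_decay(ts, ys)
print(round(a, 3), round(k, 3))   # recovers the generating parameters
```

With many curves, the (a, k) pairs form a feature matrix that a supervised classifier or multidimensional scaling could then operate on, in the spirit of the workflow the abstract describes.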
Contributors: Torres Garcia, Wandaliz (Author) / Meldrum, Deirdre R. (Thesis advisor) / Runger, George C. (Thesis advisor) / Gel, Esma S. (Committee member) / Li, Jing (Committee member) / Zhang, Weiwen (Committee member) / Arizona State University (Publisher)
Created: 2011
Description
Hepatocellular carcinoma (HCC) is a malignant tumor and the seventh most common cancer in humans. Every year there is a significant rise in the number of patients suffering from HCC. Most clinical research has focused on early detection of HCC so that the patient's chances of survival are high. Emerging advancements in functional and structural imaging techniques have provided the ability to detect microscopic changes in the tumor microenvironment and microstructure. The prime focus of this thesis is to validate the applicability of an advanced imaging modality, Magnetic Resonance Elastography (MRE), for HCC diagnosis. The research was carried out on data from three HCC patients, and three sets of experiments were conducted. The main focus was on the quantitative aspect of MRE in conjunction with texture analysis, an advanced image-processing pipeline, and a multivariate machine learning method for accurate HCC diagnosis. We analyzed techniques for handling unbalanced data and evaluated the efficacy of sampling techniques. Along with this, we studied different machine learning algorithms and developed models using them. Performance metrics such as prediction accuracy, sensitivity and specificity were used to evaluate the final model. We were able to identify the significant features in the dataset, and the selected classifier was robust in predicting the response class variable with high accuracy.
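The evaluation metrics named in the abstract (accuracy, sensitivity, specificity) can be computed from a binary confusion matrix as sketched below. The labels and predictions are made-up illustrations, not the study's data.

```python
# Hypothetical illustration: evaluation metrics for a binary classifier
# (1 = HCC-positive, 0 = negative), computed from confusion-matrix counts.

def confusion_counts(y_true, y_pred):
    """Return (tp, tn, fp, fn) for binary label lists."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return tp, tn, fp, fn

def metrics(y_true, y_pred):
    tp, tn, fp, fn = confusion_counts(y_true, y_pred)
    accuracy = (tp + tn) / len(y_true)
    sensitivity = tp / (tp + fn)   # true positive rate
    specificity = tn / (tn + fp)   # true negative rate
    return accuracy, sensitivity, specificity

# Toy example: 4 positives, 6 negatives; classifier misses one positive
# and raises one false alarm.
y_true = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
y_pred = [1, 1, 1, 0, 0, 0, 0, 0, 1, 0]
acc, sens, spec = metrics(y_true, y_pred)
print(acc, sens, spec)   # 0.8, 0.75, ~0.833
```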
Contributors: Bansal, Gaurav (Author) / Wu, Teresa (Thesis advisor) / Mitchell, Ross (Thesis advisor) / Li, Jing (Committee member) / Arizona State University (Publisher)
Created: 2013
Description
A P-value based method is proposed for statistical monitoring of various types of profiles in phase II. The performance of the proposed method is evaluated by the average run length criterion under various shifts in the intercept, slope and error standard deviation of the model. In our proposed approach, P-values are computed at each level within a sample. If at least one of the P-values is less than a pre-specified significance level, the chart signals out-of-control. The primary advantage of our approach is that only one control chart is required to monitor several parameters simultaneously: the intercept, slope(s), and the error standard deviation. A comprehensive comparison of the proposed method and the existing KMW-Shewhart method for monitoring linear profiles is conducted. In addition, the effect that the number of observations within a sample has on the performance of the proposed method is investigated. The proposed method was also compared to the T^2 method discussed in Kang and Albin (2000) for multivariate, polynomial, and nonlinear profiles. A simulation study shows that overall the proposed P-value method performs satisfactorily for different profile types.
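The signaling rule described above can be sketched as follows, assuming a known in-control linear profile and error standard deviation: compute a P-value at each x-level of the incoming sample and signal if any falls below the significance level. The profile and numbers are illustrative, not the thesis's simulation settings.

```python
import math

# Illustrative sketch of the P-value signaling rule (not the thesis code):
# in-control profile y = b0 + b1*x with known error standard deviation sigma.

def p_value(residual, sigma):
    """Two-sided normal P-value for one level's deviation from the profile."""
    z = abs(residual) / sigma
    return 2.0 * (1.0 - 0.5 * (1.0 + math.erf(z / math.sqrt(2.0))))

def chart_signals(xs, ys, b0, b1, sigma, alpha):
    """Signal out-of-control if any level's P-value is below alpha."""
    p_values = [p_value(y - (b0 + b1 * x), sigma) for x, y in zip(xs, ys)]
    return min(p_values) < alpha

xs = [1, 2, 3, 4]
in_control = [3.1, 5.0, 6.9, 9.1]    # close to y = 1 + 2x
shifted    = [3.1, 5.0, 6.9, 13.0]   # large deviation at the last level
print(chart_signals(xs, in_control, 1.0, 2.0, 0.5, 0.005))  # no signal
print(chart_signals(xs, shifted,    1.0, 2.0, 0.5, 0.005))  # signals
```

A single chart of min-P-values monitors intercept, slope, and error variance shifts at once, which is the advantage the abstract highlights; the significance level alpha would be tuned to achieve a target in-control average run length.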
Contributors: Adibi, Azadeh (Author) / Montgomery, Douglas C. (Thesis advisor) / Borror, Connie (Thesis advisor) / Li, Jing (Committee member) / Zhang, Muhong (Committee member) / Arizona State University (Publisher)
Created: 2013
Description
Rapid advances in sensor and information technology have resulted in spatially and temporally data-rich environments, which creates a pressing need to develop novel statistical methods and the associated computational tools to extract intelligent knowledge and informative patterns from these massive datasets. The statistical challenges for addressing these massive datasets lie in their complex structures, such as high-dimensionality, hierarchy, multi-modality, heterogeneity and data uncertainty. Besides the statistical challenges, the associated computational approaches are also considered essential in achieving efficiency, effectiveness, as well as numerical stability in practice. On the other hand, some recent developments in statistics and machine learning, such as sparse learning and transfer learning, and some traditional methodologies which still hold potential, such as multi-level models, all shed light on addressing these complex datasets in a statistically powerful and computationally efficient way. In this dissertation, we identify four kinds of general complex datasets, including "high-dimensional datasets", "hierarchically-structured datasets", "multimodality datasets" and "data uncertainties", which are ubiquitous in many domains, such as biology, medicine, neuroscience, health care delivery, and manufacturing. We depict the development of novel statistical models to analyze complex datasets which fall under these four categories, and we show how these models can be applied to real-world applications, such as Alzheimer's disease research, nursing care processes, and manufacturing.
Contributors: Huang, Shuai (Author) / Li, Jing (Thesis advisor) / Askin, Ronald (Committee member) / Ye, Jieping (Committee member) / Runger, George C. (Committee member) / Arizona State University (Publisher)
Created: 2012
Description
This thesis explores and analyzes the emergence of for-profit stem cell clinics in the United States, specifically in the Phoenix metropolitan area. Stem cell therapy is an emerging field that has great potential in preventing or treating a number of diseases. Certain companies are currently researching the application of stem cells as therapeutics. At present the FDA has approved only one stem cell-based product; however, a number of companies currently offer stem cell therapies. In the past five years, most news articles discussing companies offering stem cell treatments have spoken of clinics in other countries. Recently, a number of stem cell clinics appear to have opened in the United States. Using a web search engine, fourteen stem cell clinics in the Phoenix metropolitan area were identified and analyzed. Each clinic was analyzed in terms of four key characteristics: business operations, stem cell types, stem cell isolation methods, and its position with the FDA. Based on my analysis, most of the identified clinics are located in Scottsdale or Phoenix. Some of these clinics even share the same location as another medical practice. Each of the fourteen clinics treats more than one type of health condition. The stem cell clinics make use of four stem cell types and three different isolation methods to obtain the stem cells. The doctors running these clinics almost always treat health conditions outside of their expertise. Some of these clinics even claim they are not subject to FDA regulation.
Contributors: Amrelia, Divya Vikas (Author) / Brafman, David (Thesis director) / Frow, Emma (Committee member) / Harrington Bioengineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2016-05
Description
The Department of Defense (DoD) acquisition system is a complex system riddled with cost and schedule overruns. These cost and schedule overruns are very serious issues, as the acquisition system is responsible for aiding U.S. warfighters. Hence, if the acquisition process is failing, that could be a potential threat to our nation's security. Furthermore, the DoD acquisition system is responsible for the proper allocation of billions of taxpayers' dollars and employs many civilians and military personnel. Much research has been done in the past on the acquisition system, with little impact or success. One reason for this lack of success in improving the system is the lack of accurate models to test theories. This research is a continuation of the effort on the Enterprise Requirements and Acquisition Model (ERAM), a discrete event simulation model of the DoD acquisition system. We propose to extend ERAM using agent-based simulation principles due to the many interactions among the subsystems of the acquisition system. We initially identify ten sub-models needed to simulate the acquisition system. This research focuses on three sub-models related to the budget of acquisition programs. In this thesis, we present the data collection, data analysis, initial implementation, and initial validation needed to facilitate these sub-models and lay the groundwork for a full agent-based simulation of the DoD acquisition system.
Contributors: Bucknell, Sophia Robin (Author) / Wu, Teresa (Thesis director) / Li, Jing (Committee member) / Colombi, John (Committee member) / Industrial, Systems (Contributor) / Barrett, The Honors College (Contributor)
Created: 2016-05
Description
In today's global market, companies are facing unprecedented levels of uncertainties in supply, demand and in the economic environment. A critical issue for companies to survive increasing competition is to monitor the changing business environment and manage disturbances and changes in real time. In this dissertation, an integrated framework is proposed using simulation and online calibration methods to enable the adaptive management of large-scale complex supply chain systems. The design, implementation and verification of the integrated approach are studied in this dissertation. The research contributions are two-fold. First, this work enriches symbiotic simulation methodology by proposing a framework of simulation and advanced data fusion methods to improve simulation accuracy. Data fusion techniques optimally calibrate the simulation state/parameters by considering errors in both the simulation models and in measurements of the real-world system. Data fusion methods - Kalman Filtering, Extended Kalman Filtering, and Ensemble Kalman Filtering - are examined and discussed under varied conditions of system chaotic levels, data quality and data availability. Second, the proposed framework is developed, validated and demonstrated in `proof-of-concept' case studies on representative supply chain problems. In the case study of a simplified supply chain system, Kalman Filtering is applied to fuse simulation data and emulation data to effectively improve the accuracy of the detection of abnormalities. In the case study of the `beer game' supply chain model, the system's chaotic level is identified as a key factor to influence simulation performance and the choice of data fusion method. Ensemble Kalman Filtering is found more robust than Extended Kalman Filtering in a highly chaotic system. With appropriate tuning, the improvement of simulation accuracy is up to 80% in a chaotic system, and 60% in a stable system. 
In the last study, the integrated framework is applied to adaptive inventory control of a multi-echelon supply chain with non-stationary demand. It is worth pointing out that the framework proposed in this dissertation is not only useful in supply chain management, but also suitable to model other complex dynamic systems, such as healthcare delivery systems and energy consumption networks.
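The core fusion step behind the Kalman Filtering approach described above can be sketched with a minimal scalar update: the simulation supplies a model prediction, a sensor supplies a noisy measurement, and the filter weighs the two by their error variances. The inventory scenario and all numbers are hypothetical, not from the dissertation's case studies.

```python
# Minimal scalar Kalman update sketch: calibrate a simulation's predicted
# state using a real-world measurement, accounting for errors in both.

def kalman_update(x_pred, p_pred, z, r):
    """Fuse model prediction (x_pred, variance p_pred) with measurement z
    (variance r); return the calibrated estimate and its variance."""
    k = p_pred / (p_pred + r)            # Kalman gain
    x_new = x_pred + k * (z - x_pred)    # corrected state estimate
    p_new = (1.0 - k) * p_pred           # reduced uncertainty
    return x_new, p_new

# Hypothetical inventory level: the simulation predicts 100 units
# (variance 16); the warehouse sensor reads 108 units (variance 4).
x, p = kalman_update(100.0, 16.0, 108.0, 4.0)
print(x, p)   # the estimate is pulled toward the more reliable measurement
```

Extended and Ensemble Kalman Filtering generalize this update to nonlinear dynamics, via linearization and sampled ensembles respectively, which is why the ensemble variant proved more robust in the highly chaotic `beer game' system.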
Contributors: Wang, Shanshan (Author) / Wu, Teresa (Thesis advisor) / Fowler, John (Thesis advisor) / Pfund, Michele (Committee member) / Li, Jing (Committee member) / Pavlicek, William (Committee member) / Arizona State University (Publisher)
Created: 2010
Description
As life expectancy increases worldwide, age related diseases are becoming greater health concerns. One of the most prevalent age-related diseases in the United States is dementia, with Alzheimer’s disease (AD) being the most common form, accounting for 60-80% of cases. Genetics plays a large role in a person’s risk of developing AD. Familial AD, which makes up less than 1% of all AD cases, is caused by autosomal dominant gene mutations and has almost 100% penetrance. Genetic risk factors are believed to make up about 49%-79% of the risk in sporadic cases. Many different genetic risk factors for both familial and sporadic AD have been identified, but there is still much work to be done in the field of AD, especially in non-Caucasian populations. This review summarizes the three major genes responsible for familial AD, namely APP, PSEN1 and PSEN2. Also discussed are seven identified genetic risk factors for sporadic AD, single nucleotide polymorphisms in the APOE, ABCA7, NEDD9, CASS4, PTK2B, CLU, and PICALM genes. An overview of the main function of the proteins associated with the genes is given, along with their proposed connection to AD pathology.

Contributors: Richey, Alexandra Emmeline (Author) / Brafman, David (Thesis director) / Raman, Sreedevi (Committee member) / School of International Letters and Cultures (Contributor) / Harrington Bioengineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2021-05
Description
Technological applications are continually being developed in the healthcare industry as technology becomes increasingly more available. In recent years, companies have started creating mobile applications to address various conditions and diseases. This falls under mHealth, or the “use of mobile phones and other wireless technology in medical care” (Rouse, 2018). The goal of this study was to identify whether data gathered through the use of mHealth methods can be used to build predictive models. The first part of this thesis contains a literature review presenting relevant definitions and several potential studies that involved the use of technology in healthcare applications. The second part of this thesis focuses on data from one study, where regression analysis is used to develop predictive models.

Rouse, M. (2018). mHealth (mobile health). Retrieved from https://searchhealthit.techtarget.com/definition/mHealth
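The regression-based predictive modeling described in the second part can be sketched with a simple ordinary least-squares fit. The feature (daily step count) and the outcome (a wellness score) are hypothetical stand-ins for mHealth readings, not the study's data.

```python
# Hedged sketch: fit a one-predictor ordinary least-squares regression and
# use it to predict a health outcome from a (hypothetical) mHealth reading.

def fit_simple_ols(xs, ys):
    """Return (intercept, slope) minimizing squared error for one predictor."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    sxx = sum((x - mean_x) ** 2 for x in xs)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return mean_y - slope * mean_x, slope

# Hypothetical data: daily step count (thousands) vs. a wellness score.
steps  = [2.0, 4.0, 6.0, 8.0, 10.0]
scores = [50.0, 55.0, 62.0, 66.0, 72.0]
b0, b1 = fit_simple_ols(steps, scores)
predicted = b0 + b1 * 7.0   # predicted score at 7,000 steps
print(round(b0, 2), round(b1, 2), round(predicted, 2))
```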
Contributors: Akers, Lindsay (Co-author) / Kiraly, Alyssa (Co-author) / Li, Jing (Thesis director) / Yoon, Hyunsoo (Committee member) / Industrial, Systems & Operations Engineering Prgm (Contributor) / Barrett, The Honors College (Contributor)
Created: 2020-05
Description
Cell viability is an important assessment in cell culture to characterize the health of the cell population and confirm that cells are alive. Morphology or end-line assays are used to determine the cell viability of entire populations. Intracellular pO2 levels are indicative of cell health and metabolism and can be used as a factor to assess cell viability in an in-line assay. Siloxane-based pO2-sensing nanoprobes present a modality to visualize intracellular pO2. Using fluorescence lifetime imaging microscopy (FLIM), pO2 levels can be mapped intracellularly as a highly functional in-line assay for cell viability. FLIM is an imaging modality that reconstructs an image based on fluorescence lifetimes. Nanoprobes were synthesized under different manufacturing/storage conditions. The nanoprobes for both long- and short-term storage were characterized in a cell-free environment, testing for changes in fluorescence intensity and in average and maximum nanoprobe diameter. The nanoprobes were validated in two different culture systems, 2D and microcarrier, for human-derived neural progenitor cells (NPCs) and neurons. Long- and short-term storage nanoprobes were used to label different neuron-based culture systems to assess labeling efficiency through fluorescence microscopy and flow cytometry. NPCs and neurons in each culture system were tested to see whether nanoprobe labeling affected cellular phenotype for traits such as cell proliferation, gene expression, and calcium imaging. Long-term and short-term storage nanoprobes were successfully validated for both NPCs and neurons in all culture systems. Assessments of the pO2-sensing nanoprobes will be further developed to create a highly functional and efficient in-line test for cell viability.
Contributors: Leyasi, Salma (Author) / Brafman, David (Thesis director) / Kodibagkar, Vikram (Committee member) / Harrington Bioengineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2020-05