Matching Items (35)

Metabolic Remodeling of Membrane Glycerolipids in the Microalga Nannochloropsis oceanica under Nitrogen Deprivation

Description

The lack of lipidome analytical tools has limited our ability to gain new knowledge about lipid metabolism in microalgae, especially for membrane glycerolipids. An electrospray ionization mass spectrometry-based lipidomics method was developed for Nannochloropsis oceanica IMET1, which resolved 41 membrane glycerolipid molecular species belonging to eight classes. Changes in membrane glycerolipids under nitrogen deprivation and high-light (HL) conditions were uncovered. The results showed that the amounts of the plastidial membrane lipids, including monogalactosyldiacylglycerol and phosphatidylglycerol, and of the extraplastidic lipids diacylglyceryl-O-4′-(N,N,N-trimethyl)homoserine and phosphatidylcholine decreased drastically under HL and nitrogen deprivation stresses. Algal cells accumulated considerably more digalactosyldiacylglycerol and sulfoquinovosyldiacylglycerol under both stresses. The genes encoding enzymes responsible for the biosynthesis, modification, and degradation of glycerolipids were identified by mining a time-course global RNA-seq data set. The data suggest that the reduction in lipid content under nitrogen deprivation is not attributable to retarded biosynthesis, at least at the gene expression level, as most genes involved in biosynthesis were unaffected by nitrogen supply and several were even significantly up-regulated. Additionally, a conceptual eicosapentaenoic acid (EPA) biosynthesis network is proposed based on the lipidomic and transcriptomic data, which highlights the import of EPA from cytosolic glycerolipids into the plastid for the synthesis of EPA-containing chloroplast membrane lipids.

Date Created
2017-08-04

Multi-Parametric MRI and Texture Analysis to Visualize Spatial Histologic Heterogeneity and Tumor Extent in Glioblastoma

Description

Background: Genetic profiling represents the future of neuro-oncology but suffers from inadequate biopsies in heterogeneous tumors like glioblastoma (GBM). Contrast-enhanced MRI (CE-MRI) targets the enhancing core (ENH) but yields adequate tumor in only ~60% of cases. Further, CE-MRI poorly localizes infiltrative tumor within the surrounding non-enhancing parenchyma, or brain-around-tumor (BAT), despite the importance of characterizing this tumor segment, which universally recurs. In this study, we use multiple texture analysis and machine learning (ML) algorithms to analyze multi-parametric MRI and produce new images indicating tumor-rich targets in GBM.

Methods: We recruited primary GBM patients undergoing image-guided biopsies and acquired pre-operative MRI: CE-MRI, dynamic susceptibility-weighted contrast-enhanced MRI, and diffusion tensor imaging. Following image coregistration and region-of-interest (ROI) placement at biopsy locations, we compared MRI metrics and regional texture with histologic diagnoses of high- vs low-tumor content (≥80% vs <80% tumor nuclei) for corresponding samples. In a training set, we used three texture analysis algorithms and three ML methods to identify the MRI-texture features that optimized model accuracy in distinguishing tumor content. We confirmed model accuracy in a separate validation set.
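
As an illustration of this kind of pipeline, the sketch below extracts classic gray-level co-occurrence matrix (GLCM) texture features from an ROI and cross-validates a classifier for high- vs low-tumor content. It is a hedged, generic example with synthetic placeholder data, not the study's actual algorithms, features, or images.

```python
# Hedged sketch: GLCM texture features from an ROI, then cross-validated
# classification of high- vs low-tumor content. Data are synthetic stand-ins.
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def glcm_features(roi_8bit):
    """Summarize an 8-bit grayscale ROI with classic GLCM statistics."""
    glcm = graycomatrix(roi_8bit, distances=[1, 2],
                        angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                        levels=256, symmetric=True, normed=True)
    props = ("contrast", "dissimilarity", "homogeneity", "energy", "correlation")
    return np.hstack([graycoprops(glcm, p).ravel() for p in props])

# Hypothetical data: one ROI per biopsy, label 1 = high tumor content.
rng = np.random.default_rng(0)
rois = [rng.integers(0, 256, size=(32, 32), dtype=np.uint8) for _ in range(60)]
X = np.vstack([glcm_features(r) for r in rois])
y = rng.integers(0, 2, size=60)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```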

Results: We collected 82 biopsies from 18 GBMs throughout ENH and BAT. The MRI-based model achieved 85% cross-validated accuracy in diagnosing high- vs low-tumor content in the training set (60 biopsies, 11 patients) and 81.8% accuracy in the validation set (22 biopsies, 7 patients).

Conclusion: Multi-parametric MRI and texture analysis can help characterize and visualize GBM’s spatial histologic heterogeneity to identify regional tumor-rich biopsy targets.

Date Created
2015-11-24

Cost Driven Agent Based Simulation of the Department of Defense Acquisition System

Description

The Department of Defense (DoD) acquisition system is a complex system riddled with cost and schedule overruns. These overruns are serious issues because the acquisition system is responsible for aiding U.S. warfighters; a failing acquisition process is therefore a potential threat to our nation's security. Furthermore, the DoD acquisition system is responsible for the proper allocation of billions of taxpayers' dollars and employs many civilians and military personnel. Much research has been done on the acquisition system in the past, with little impact or success. One reason for this lack of success is the absence of accurate models with which to test theories. This research continues the effort on the Enterprise Requirements and Acquisition Model (ERAM), a discrete-event simulation of the DoD acquisition system. We propose to extend ERAM using agent-based simulation principles, given the many interactions among the subsystems of the acquisition system. We initially identify ten sub-models needed to simulate the acquisition system. This research focuses on three sub-models related to the budget of acquisition programs. In this thesis, we present the data collection, data analysis, initial implementation, and initial validation needed to build these sub-models and lay the groundwork for a full agent-based simulation of the DoD acquisition system.
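
A minimal sketch of the agent-based style proposed here: acquisition-program agents drawing on a shared annual budget, with underfunding stretching program schedules. The agents, allocation rule, and numbers are hypothetical illustrations, not ERAM itself.

```python
# Hypothetical illustration of agent-based simulation: program agents compete
# for a shared annual budget; unfunded work rolls into later years. Not ERAM.
import random

class ProgramAgent:
    def __init__(self, name, cost_remaining, annual_request):
        self.name = name
        self.cost_remaining = cost_remaining
        self.annual_request = annual_request
        self.years = 0

    def step(self, funds_granted):
        # Underfunded work carries over, stretching the program schedule.
        self.cost_remaining -= funds_granted
        self.years += 1

random.seed(1)
programs = [ProgramAgent(f"P{i}", random.uniform(50, 150), 20) for i in range(5)]
budget = 70  # shared annual budget, deliberately below total requests

for year in range(12):
    active = [p for p in programs if p.cost_remaining > 0]
    if not active:
        break
    share = budget / len(active)  # naive equal-share allocation rule
    for p in active:
        p.step(min(share, p.annual_request, p.cost_remaining))

for p in programs:
    print(p.name, "years:", p.years, "cost remaining:", round(p.cost_remaining, 1))
```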

Date Created
2016-05

MRI-Based Texture Analysis to Differentiate Sinonasal Squamous Cell Carcinoma from Inverted Papilloma

Description

BACKGROUND AND PURPOSE: Sinonasal inverted papilloma (IP) can harbor squamous cell carcinoma (SCC). Consequently, differentiating these tumors is important. The objective of this study was to determine whether MRI-based texture analysis can differentiate SCC from IP and provide supplementary information to the radiologist.

MATERIALS AND METHODS: Adult patients who had IP or SCC resected were eligible (coexistent IP and SCC were excluded). Inclusion required a tumor size greater than 1.5 cm and a pre-operative MRI with axial T1, axial T2, and axial T1 post-contrast sequences. Five well-established texture analysis algorithms were applied to an ROI from the largest tumor cross-section. For a training dataset, machine-learning algorithms were used to identify the most accurate model, and performance was also evaluated in a validation dataset. Based on three separate blinded reviews of the ROI, the isolated tumor, and the entire images, two neuroradiologists predicted tumor type in consensus.

RESULTS: The IP and SCC cohorts were matched for age and gender, while SCC tumor volume was larger (p=0.001). The best classification model achieved similar accuracies in the training (17 SCC, 16 IP; 90.9%) and validation (7 SCC, 6 IP; 84.6%) datasets (p=0.537). The machine-learning accuracy for the entire cohort (89.1%) was better than that of the neuroradiologists' ROI review (56.5%, p=0.0004) but not significantly different from their review of the tumors (73.9%, p=0.060) or the entire images (87.0%, p=0.748).

CONCLUSION: MRI-based texture analysis has the potential to differentiate SCC from IP and may provide incremental information to the neuroradiologist, particularly for small or heterogeneous tumors.
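
The model-selection pattern described in the methods (pick the most accurate classifier by cross-validation on the training set, then confirm on a held-out validation set) can be sketched as follows. The data, candidate models, and dimensions are hypothetical stand-ins, not the study's texture features.

```python
# Hedged sketch of the model-selection pattern: cross-validate candidate
# classifiers, keep the best, then report held-out validation accuracy.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X_train, y_train = rng.normal(size=(33, 20)), rng.integers(0, 2, 33)  # 17 + 16 cases
X_valid, y_valid = rng.normal(size=(13, 20)), rng.integers(0, 2, 13)  # 7 + 6 cases

candidates = {
    "svm": SVC(kernel="rbf", C=1.0),
    "random_forest": RandomForestClassifier(n_estimators=100, random_state=0),
    "knn": KNeighborsClassifier(n_neighbors=3),
}
# Select the model with the best mean cross-validated training accuracy.
best_name = max(candidates, key=lambda name: cross_val_score(
    candidates[name], X_train, y_train, cv=5).mean())
best = candidates[best_name].fit(X_train, y_train)
print(best_name, "validation accuracy:",
      accuracy_score(y_valid, best.predict(X_valid)))
```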

Date Created
2016-12

A P-value based approach for phase II profile monitoring

Description

A P-value-based method is proposed for the statistical monitoring of various types of profiles in phase II. The performance of the proposed method is evaluated by the average run length (ARL) criterion under various shifts in the intercept, slope, and error standard deviation of the model. In our proposed approach, P-values are computed at each level within a sample. If at least one of the P-values is less than a pre-specified significance level, the chart signals an out-of-control condition. The primary advantage of our approach is that only one control chart is required to monitor several parameters simultaneously: the intercept, slope(s), and the error standard deviation. A comprehensive comparison of the proposed method and the existing KMW-Shewhart method for monitoring linear profiles is conducted. In addition, the effect of the number of observations within a sample on the performance of the proposed method is investigated. The proposed method is also compared to the T^2 method discussed in Kang and Albin (2000) for multivariate, polynomial, and nonlinear profiles. A simulation study shows that, overall, the proposed P-value method performs satisfactorily for different profile types.
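
A minimal sketch of the signaling rule for a simple linear profile may make this concrete. It simplifies the scheme to one P-value per parameter (intercept, slope, and error standard deviation) per sample, with assumed in-control values A0, A1, and sigma; the exact per-level computation of the proposed method may differ.

```python
# Hedged sketch: phase II monitoring of a linear profile. Each sample is
# fitted by OLS; a P-value is computed per parameter, and the chart signals
# when any P-value falls below alpha. ARL is then estimated by simulation.
import numpy as np
from scipy import stats

A0, A1, SIGMA, ALPHA = 3.0, 2.0, 1.0, 0.005   # assumed in-control model
x = np.linspace(2, 8, 4)                      # fixed x-levels in each sample

def sample_pvalues(y):
    n = len(x)
    b1, b0 = np.polyfit(x, y, 1)              # OLS slope and intercept
    resid = y - (b0 + b1 * x)
    mse = resid @ resid / (n - 2)
    sxx = ((x - x.mean()) ** 2).sum()
    se_b0 = np.sqrt(mse * (1 / n + x.mean() ** 2 / sxx))
    se_b1 = np.sqrt(mse / sxx)
    p0 = 2 * stats.t.sf(abs(b0 - A0) / se_b0, df=n - 2)
    p1 = 2 * stats.t.sf(abs(b1 - A1) / se_b1, df=n - 2)
    chi2 = (n - 2) * mse / SIGMA ** 2         # two-sided test on sigma
    ps = stats.chi2.cdf(chi2, df=n - 2)
    return p0, p1, 2 * min(ps, 1 - ps)

def run_length(shift_slope, rng):
    t = 0
    while True:                               # sample until the chart signals
        t += 1
        y = A0 + (A1 + shift_slope) * x + rng.normal(0, SIGMA, len(x))
        if min(sample_pvalues(y)) < ALPHA:
            return t

print("ARL estimate under a slope shift:",
      np.mean([run_length(0.2, np.random.default_rng(s)) for s in range(100)]))
```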

Date Created
2013

mHealth Patient Care Improvement Study Through Statistical Analysis

Description

Technological applications are continually being developed in the healthcare industry as technology becomes increasingly available. In recent years, companies have started creating mobile applications to address various conditions and diseases. This falls under mHealth, or the “use of mobile phones and other wireless technology in medical care” (Rouse, 2018). The goal of this study was to determine whether data gathered through mHealth methods can be used to build predictive models. The first part of this thesis contains a literature review presenting relevant definitions and several studies that involved the use of technology in healthcare applications. The second part focuses on data from one study, where regression analysis is used to develop predictive models.

Rouse, M. (2018). mHealth (mobile health). Retrieved from https://searchhealthit.techtarget.com/definition/mHealth
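
As a generic illustration of the second part, the sketch below fits a regression model to app-collected measurements and reports held-out predictive fit. The features and coefficients are hypothetical; the study's actual variables are not reproduced here.

```python
# Hedged sketch: a regression-based predictive model on hypothetical
# mHealth-style data (e.g., app usage, adherence, baseline score).
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))                       # placeholder predictors
y = X @ np.array([1.5, -0.7, 0.3]) + rng.normal(0, 0.5, 100)  # synthetic outcome

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = LinearRegression().fit(X_tr, y_tr)
print("held-out R^2:", r2_score(y_te, model.predict(X_te)))
```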

Date Created
2020-05

A model fusion based framework for imbalanced classification problem with noisy dataset

Description

Data imbalance and data noise often coexist in real-world datasets. Data imbalance affects a learning classifier by degrading its recognition power on the minority class, while data noise affects it by providing inaccurate information and thus misleading the classifier. Because of these differences, data imbalance and data noise have been treated separately in the data mining field. Yet such an approach ignores their mutual effects and, as a result, may lead to new problems. A desirable solution is to tackle these two issues jointly. Noting the complementary nature of generative and discriminative models, this research proposes a unified model-fusion-based framework to handle imbalanced classification with noisy datasets.

The phase I study focuses on the imbalanced classification problem. A generative classifier, the Gaussian mixture model (GMM), is studied because it can learn the distribution of the imbalanced data and improve discrimination power on the minority class. By fusing this knowledge into a cost-sensitive SVM (cSVM), a CSG method is proposed. Experimental results show the effectiveness of CSG in dealing with imbalanced classification problems.
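
One hedged reading of this fusion, sketched below, learns a GMM on the minority class and appends its log-density as an extra feature for a class-weighted (cost-sensitive) SVM; the thesis's actual CSG coupling may differ in detail.

```python
# Hedged illustration of the phase-I idea: GMM knowledge of the minority
# class fused into a cost-sensitive SVM. The fusion here is simplified to a
# log-density feature plus class weights; data are synthetic placeholders.
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X_maj = rng.normal(0, 1, size=(300, 2))
X_min = rng.normal(2.5, 0.7, size=(30, 2))       # imbalanced minority class
X = np.vstack([X_maj, X_min])
y = np.array([0] * 300 + [1] * 30)

gmm = GaussianMixture(n_components=2, random_state=0).fit(X_min)
X_aug = np.hstack([X, gmm.score_samples(X)[:, None]])  # fuse density knowledge

csvm = SVC(class_weight={0: 1, 1: 10})           # higher minority misclass cost
csvm.fit(X_aug, y)
print("training accuracy:", csvm.score(X_aug, y))
```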

The phase II study expands the research scope to include noisy datasets in the imbalanced classification problem. A model-fusion-based framework, K Nearest Gaussian (KNG), is proposed. KNG employs a generative modeling method, the GMM, to model the training data as Gaussian mixtures and form adjustable confidence regions that are less sensitive to data imbalance and noise. Motivated by the K-nearest-neighbor algorithm, KNG uses the neighboring Gaussians to classify test instances. Experimental results show that KNG greatly outperforms traditional classification methods in dealing with imbalanced classification problems on noisy datasets.
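
A minimal sketch of the nearest-Gaussian idea: fit a GMM per class, then classify a test point by majority vote among the classes of its K nearest mixture components (here, nearest by Mahalanobis distance). Component counts and data are illustrative assumptions, not the thesis's exact formulation.

```python
# Hedged sketch of a K-Nearest-Gaussian-style classifier: per-class GMMs,
# then a vote among the K closest mixture components.
import numpy as np
from sklearn.mixture import GaussianMixture

def fit_components(X_by_class, n_components=3, seed=0):
    comps = []  # (mean, inverse covariance, class label) per component
    for label, Xc in X_by_class.items():
        gmm = GaussianMixture(n_components=n_components, random_state=seed).fit(Xc)
        for mu, cov in zip(gmm.means_, gmm.covariances_):
            comps.append((mu, np.linalg.inv(cov), label))
    return comps

def predict_kng(comps, x, k=3):
    # Mahalanobis distance from x to each Gaussian component.
    d = [np.sqrt((x - mu) @ inv @ (x - mu)) for mu, inv, _ in comps]
    nearest = np.argsort(d)[:k]
    votes = [comps[i][2] for i in nearest]
    return max(set(votes), key=votes.count)

rng = np.random.default_rng(0)
X_by_class = {0: rng.normal(0, 1, (200, 2)), 1: rng.normal(3, 1, (40, 2))}
comps = fit_components(X_by_class)
print(predict_kng(comps, np.array([2.8, 3.1])))   # likely class 1
```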

The phase III study addresses the issues of feature selection and parameter tuning for the KNG algorithm. To further improve its performance, a particle swarm optimization (PSO) based method, PSO-KNG, is proposed. PSO-KNG formulates model parameters and data features into the same particle vector and can thus search for the best feature and parameter combination jointly. The experimental results show that PSO can greatly improve the performance of KNG, with better accuracy and much lower computational cost.
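
The joint encoding can be sketched as a particle that concatenates a feature on/off mask with a model parameter, so one PSO run searches both. The stand-in classifier, fitness function, and PSO constants below are assumptions for illustration, not the thesis's PSO-KNG configuration.

```python
# Hedged sketch: PSO over a particle that encodes a feature mask (first 8
# slots) and a log-scale model parameter (last slot). Fitness is the
# cross-validated accuracy of a stand-in SVC on synthetic data.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 8))
w_true = np.array([2, -1, 0, 0, 1, 0, 0, 0])      # only some features matter
y = (X @ w_true + rng.normal(0, 0.5, 120) > 0).astype(int)

def fitness(p):
    mask = p[:8] > 0                              # feature on/off decisions
    if not mask.any():
        return 0.0
    C = float(np.exp(np.clip(p[8], -5, 5)))       # model parameter, kept sane
    return cross_val_score(SVC(C=C), X[:, mask], y, cv=3).mean()

n, dim = 12, 9
pos = rng.normal(size=(n, dim)); vel = np.zeros((n, dim))
pbest = pos.copy(); pbest_fit = np.array([fitness(p) for p in pos])
for _ in range(15):
    gbest = pbest[pbest_fit.argmax()]
    r1, r2 = rng.random((n, dim)), rng.random((n, dim))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = pos + vel
    fit = np.array([fitness(p) for p in pos])
    improved = fit > pbest_fit                    # update personal bests
    pbest[improved], pbest_fit[improved] = pos[improved], fit[improved]
print("best cross-validated accuracy:", pbest_fit.max())
```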

Date Created
2014

Privacy-preserving mobile crowd sensing

Description

The presence of a rich set of embedded sensors on mobile devices has been fuelling various sensing applications regarding the activities of individuals and their surrounding environment, and these ubiquitous sensing-capable mobile devices are pushing the new paradigm of Mobile Crowd Sensing (MCS) from concept to reality. MCS aims to outsource sensing data collection to mobile users, and it could revolutionize the traditional ways of collecting and processing sensing data. In the meantime, cloud computing provides cloud-backed infrastructures that mobile devices can reach over the network to extend their capabilities. With enormous computational and storage resources along with sufficient bandwidth, the cloud functions as the hub that handles sensing service requests from consumers and coordinates sensing task assignment among eligible mobile users to reach a desired quality of sensing service. This work studies the problem of assigning sensing tasks to mobile device owners with specific spatio-temporal traits so as to minimize cost and maximize utility in MCS while adhering to QoS constraints. Greedy approaches and hybrid solutions combined with bee algorithms are explored to address the problem.
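
A hedged sketch of a greedy baseline consistent with this problem statement: each task goes to the eligible user with the best utility-to-cost ratio, subject to a per-user capacity standing in for a QoS constraint. The data structures and numbers are hypothetical; the bee-algorithm hybrids are not shown.

```python
# Hedged sketch of greedy sensing-task assignment: best utility-per-cost
# eligible user wins each task, up to a per-user capacity.
def greedy_assign(tasks, users, capacity):
    """tasks: list of task ids; users: {user: {task: (utility, cost)}}."""
    load = {u: 0 for u in users}
    assignment = {}
    for t in tasks:
        candidates = [(users[u][t][0] / users[u][t][1], u)
                      for u in users if t in users[u] and load[u] < capacity]
        if candidates:  # pick the eligible user with best utility per unit cost
            _, best = max(candidates)
            assignment[t] = best
            load[best] += 1
    return assignment

users = {"u1": {"t1": (5, 2), "t2": (3, 3)},
         "u2": {"t1": (4, 1), "t3": (6, 2)}}
print(greedy_assign(["t1", "t2", "t3"], users, capacity=2))
```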

Moreover, privacy concerns arise with the widespread deployment of MCS, for both the data contributors and the sensing service consumers. Uploaded sensing data, especially data tagged with spatio-temporal information, can disclose the personal information of the data contributors, and sensing service requests can reveal the personal interests of service consumers. To address these privacy issues, this work constructs a new framework named Privacy-Preserving Mobile Crowd Sensing (PP-MCS) to leverage the sensing capabilities of ubiquitous mobile devices and cloud infrastructures. PP-MCS has a distributed architecture and does not rely on trusted third parties for privacy preservation. In PP-MCS, sensing service consumers can retrieve data without learning who the real data contributors are. In addition, individual sensing records can be compared against the aggregation result while keeping their values unknown, and the k-nearest neighbors can be approximately identified without privacy leaks. As such, the privacy of the data contributors and the sensing service consumers can be protected to the greatest extent possible.
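
One building block consistent with "comparing against the aggregate while keeping individual values unknown" is pairwise additive masking, sketched below: contributors upload masked readings whose masks cancel in the sum, so only the aggregate is learned. This is a generic illustration, not the PP-MCS protocol itself.

```python
# Hedged sketch of pairwise additive masking for private aggregation: each
# pair (i, j) shares a mask m, i adds +m and j adds -m, so the uploads are
# individually meaningless but their sum equals the true aggregate.
import random

def masked_uploads(values, seed=42):
    n = len(values)
    rng = random.Random(seed)
    masks = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            m = rng.randrange(1_000_000)
            masks[i][j], masks[j][i] = m, -m   # pair (i, j) agree on a mask
    return [values[i] + sum(masks[i]) for i in range(n)]

readings = [37, 42, 29]
uploads = masked_uploads(readings)
print("uploads:", uploads)                     # reveal nothing individually
print("aggregate:", sum(uploads), "== true sum:", sum(readings))
```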

Date Created
2016

Integrative analyses of diverse biological data sources

Description

The technology expansion seen in the last decade for genomics research has permitted the generation of large-scale data sources pertaining to molecular biological assays, genomics, proteomics, transcriptomics, and other modern omics catalogs. New methods to analyze, integrate, and visualize these data types are essential to unveil relevant disease mechanisms. Towards these objectives, this research focuses on data integration within two scenarios: (1) transcriptomic, proteomic, and functional information and (2) real-time sensor-based measurements motivated by single-cell technology. To assess relationships between protein abundance and transcriptomic and functional data, a nonlinear model was explored at static and temporal levels. The successful integration of these heterogeneous data sources through the stochastic gradient boosted tree approach and its improved predictability are some highlights of this work. Through the development of an innovative validation subroutine based on a permutation approach and the use of external information (i.e., operons), the lack of a priori knowledge for undetected proteins was overcome. The integrative methodologies allowed for the identification of undetected proteins in Desulfovibrio vulgaris and Shewanella oneidensis for further laboratory exploration of functional relationships.

In an effort to better understand diseases such as cancer at different developmental stages, the Microscale Life Science Center headquartered at Arizona State University is pursuing single-cell studies by developing novel technologies. This research arranged and applied a statistical framework that tackled the following challenges: random noise, heterogeneous dynamic systems with multiple states, and understanding cell behavior within and across different Barrett's esophageal epithelial cell lines using oxygen consumption curves. These curves were characterized with good empirical fit using nonlinear models with simple structures, which allowed the extraction of a large number of features. Applying a supervised classification model to these features and integrating experimental factors allowed the identification of subtle patterns among different cell types, visualized through multidimensional scaling. Motivated by the challenges of analyzing real-time measurements, we further explored a unique two-dimensional representation of multiple time series using a wavelet approach, which showed promising results towards less complex approximations. The benefits of external information to improve the image representation were also explored.
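
A minimal sketch of the named model family follows: a stochastic gradient boosted tree regressor (the subsample < 1 setting supplies the "stochastic" part) predicting a protein-abundance-like target from transcriptomic-style features. The data are synthetic placeholders, not the study's measurements.

```python
# Hedged sketch: stochastic gradient boosted trees for a nonlinear
# integration-style regression. Features and target are synthetic stand-ins.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 10))                # e.g., mRNA levels, functional scores
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 + rng.normal(0, 0.2, 300)  # nonlinear target

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
gbt = GradientBoostingRegressor(n_estimators=300, learning_rate=0.05,
                                subsample=0.7,   # < 1 makes the boosting stochastic
                                max_depth=3, random_state=0)
gbt.fit(X_tr, y_tr)
print("held-out R^2:", gbt.score(X_te, y_te))
```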

Date Created
2011

Modeling supply chain dynamics with calibrated simulation using data fusion

Description

In today's global market, companies face unprecedented levels of uncertainty in supply, demand, and the economic environment. A critical issue for companies facing increasing competition is to monitor the changing business environment and manage disturbances and changes in real time. In this dissertation, an integrated framework using simulation and online calibration methods is proposed to enable the adaptive management of large-scale complex supply chain systems. The design, implementation, and verification of the integrated approach are studied in this dissertation. The research contributions are two-fold.

First, this work enriches symbiotic simulation methodology by proposing a framework of simulation and advanced data fusion methods to improve simulation accuracy. Data fusion techniques optimally calibrate the simulation state and parameters by considering errors in both the simulation models and the measurements of the real-world system. Data fusion methods (Kalman filtering, extended Kalman filtering, and ensemble Kalman filtering) are examined and discussed under varied conditions of system chaotic level, data quality, and data availability.

Second, the proposed framework is developed, validated, and demonstrated in proof-of-concept case studies on representative supply chain problems. In the case study of a simplified supply chain system, Kalman filtering is applied to fuse simulation data and emulation data to effectively improve the accuracy of abnormality detection. In the case study of the "beer game" supply chain model, the system's chaotic level is identified as a key factor influencing simulation performance and the choice of data fusion method. Ensemble Kalman filtering is found to be more robust than extended Kalman filtering in a highly chaotic system. With appropriate tuning, the improvement in simulation accuracy is up to 80% in a chaotic system and 60% in a stable system. In the last study, the integrated framework is applied to adaptive inventory control of a multi-echelon supply chain with non-stationary demand. It is worth pointing out that the framework proposed in this dissertation is useful not only in supply chain management but also for modeling other complex dynamic systems, such as healthcare delivery systems and energy consumption networks.
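
A one-dimensional sketch of the calibration idea: a Kalman filter fuses a simulation's forecast of a state with a noisy real-world measurement, weighting each by its error covariance. The toy dynamics and noise levels below are assumptions, not the dissertation's supply chain models.

```python
# Hedged sketch: scalar Kalman filtering to calibrate a simulated state
# (say, an inventory level) against noisy real-world measurements.
import numpy as np

def kalman_step(x_est, P, z, sim_step, Q, R):
    # Predict: advance the state with the simulation model.
    x_pred = sim_step(x_est)
    P_pred = P + Q                 # model error inflates uncertainty
    # Update: correct the prediction with the measurement z.
    K = P_pred / (P_pred + R)      # gain balances model vs measurement error
    return x_pred + K * (z - x_pred), (1 - K) * P_pred

sim_step = lambda x: 0.9 * x + 10          # toy inventory dynamics
rng = np.random.default_rng(0)
x_true, x_est, P = 100.0, 80.0, 25.0
for _ in range(20):
    x_true = sim_step(x_true) + rng.normal(0, 2.0)   # real system with noise
    z = x_true + rng.normal(0, 3.0)                  # noisy measurement
    x_est, P = kalman_step(x_est, P, z, sim_step, Q=4.0, R=9.0)
print("final estimate vs truth:", round(x_est, 1), round(x_true, 1))
```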

Date Created
2010