Matching Items (54)

Simulation-based Bayesian optimal ALT designs for model discrimination

Description

Accelerated life test (ALT) planning in a Bayesian framework is studied in this paper, with a focus on differentiating competing acceleration models when there is uncertainty as to whether the relationship between log mean life and the stress variable is linear or exhibits some curvature. The proposed criterion is based on the Hellinger distance between predictive distributions. The optimal stress-factor setup and unit allocation are determined at three stress levels, subject to test-lab equipment and test-duration constraints. Optimal designs are validated by their recovery rates, where the true, data-generating model is selected under the DIC (Deviance Information Criterion) model selection rule, and by comparing their performance with other test plans. Results show that the proposed optimal design method substantially increases a test plan's ability to distinguish among competing ALT models, thus providing better guidance as to which model is appropriate for the follow-on testing phase of the experiment.
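
As a rough illustration of the criterion's key ingredient, the sketch below numerically evaluates the Hellinger distance between two candidate predictive life distributions at a single stress level. The lognormal forms and all coefficients are hypothetical stand-ins chosen for this example, not the paper's fitted models.

```python
import numpy as np
from scipy import stats
from scipy.integrate import quad

def hellinger(pdf_f, pdf_g, lo=0.0, hi=np.inf):
    """H(f, g) = sqrt(1 - Bhattacharyya coefficient), by numerical quadrature."""
    bc, _ = quad(lambda t: np.sqrt(pdf_f(t) * pdf_g(t)), lo, hi)
    return np.sqrt(max(0.0, 1.0 - bc))

stress = 0.5                                     # standardized stress level (toy value)
mu_lin = 10.0 - 3.0 * stress                     # assumed linear log-mean-life model
mu_quad = 10.0 - 3.0 * stress - 1.5 * stress**2  # assumed model with curvature
sigma = 0.8                                      # assumed common log-life spread

f = stats.lognorm(s=sigma, scale=np.exp(mu_lin)).pdf
g = stats.lognorm(s=sigma, scale=np.exp(mu_quad)).pdf
print(f"Hellinger distance at stress {stress}: {hellinger(f, g):.4f}")
```

A design criterion of this flavor would aggregate such distances over candidate stress levels and unit allocations, favoring test plans where the competing models disagree most.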

Date Created
  • 2015-02-01

Analysis Methods for No-Confounding Screening Designs

Description

Nonregular designs are a preferable alternative to regular resolution IV designs because they avoid confounding two-factor interactions. As a result, nonregular designs can estimate and identify a few active two-factor interactions. However, due to the sometimes complex alias structure of nonregular designs, standard screening strategies can fail to identify all active effects. In this research, two-level nonregular screening designs with orthogonal main effects are discussed. By utilizing knowledge of the alias structure, a design-based model selection process for analyzing nonregular designs is proposed.

The Aliased Informed Model Selection (AIMS) strategy is a design-specific approach that is compared to three generic model selection methods: stepwise regression, the least absolute shrinkage and selection operator (LASSO), and the Dantzig selector. The AIMS approach substantially increases the power to detect active main effects and two-factor interactions relative to these generic methodologies. This research identifies design-specific model spaces: sets of models that have strong heredity, are all estimable, and exhibit no model confounding. These spaces are then used in the AIMS method, along with design-specific aliasing rules, to make model selection decisions. Model spaces and alias rules are identified for three designs: the 16-run no-confounding 6-, 7-, and 8-factor designs. The designs are demonstrated with several examples as well as simulations that show the superiority of AIMS in model selection.

A final piece of the research provides a method for augmenting no-confounding designs based on the model spaces and maximum average D-efficiency. Several augmented designs are provided for different situations. A final simulation with the augmented designs shows strong results for augmenting with four additional runs if time and resources permit.
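
For context on the generic competitors, here is a minimal sketch of one of them, the LASSO, applied to a two-level screening experiment whose model matrix carries main effects and all two-factor interactions. The random ±1 design and simulated response below are illustrative stand-ins, not the 16-run no-confounding designs or the AIMS procedure itself.

```python
import numpy as np
from itertools import combinations
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(1)
n, k = 16, 6
X_main = rng.choice([-1.0, 1.0], size=(n, k))   # stand-in for a 16-run design
pairs = list(combinations(range(k), 2))
X_int = np.column_stack([X_main[:, i] * X_main[:, j] for i, j in pairs])
X = np.hstack([X_main, X_int])

# Simulated truth: active effects A, B, and AB (a strong-heredity model).
y = 3 * X_main[:, 0] + 2 * X_main[:, 1] + 1.5 * X_int[:, 0] + rng.normal(0, 0.5, n)

fit = LassoCV(cv=4).fit(X, y)
labels = [f"x{i+1}" for i in range(k)] + [f"x{i+1}x{j+1}" for i, j in pairs]
active = [(l, round(c, 2)) for l, c in zip(labels, fit.coef_) if abs(c) > 0.1]
print("Selected effects:", active)
```

AIMS differs from this generic fit by restricting attention to design-specific model spaces and aliasing rules rather than searching over all 21 columns at once.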

Date Created
  • 2020

Data Analysis and Experimental Design for Accelerated Life Testing with Heterogeneous Group Effects

Description

In accelerated life tests (ALTs), complete randomization is hardly achievable because of economic and engineering constraints. Typical experimental protocols such as subsampling or random blocks in ALTs result in a grouped structure, which leads to correlated lifetime observations. In this dissertation, a generalized linear mixed model (GLMM) approach is proposed to analyze ALT data and to find the optimal ALT design while accounting for heterogeneous group effects.

Two types of ALTs are demonstrated for data analysis. First, constant-stress ALT (CSALT) data with a Weibull failure time distribution are modeled by a GLMM. The marginal likelihood of the observations is approximated by a quadrature rule, and the maximum likelihood (ML) estimation method is applied in an iterative fashion to estimate the unknown parameters, including the variance component of the random effect. Second, step-stress ALT (SSALT) data with random group effects are analyzed in a similar manner, but under the assumption of exponentially distributed failure times in each stress step. Two parameter estimation methods, from the frequentist and Bayesian points of view, are applied and compared with other traditional models through a simulation study and a real example of heterogeneous SSALT data. The proposed random effect model shows superiority in terms of reducing bias and variance in the estimation of the life-stress relationship.

The GLMM approach is particularly useful for the optimal experimental design of ALTs while taking the random group effects into account. Specifically, planning ALTs under a nested design structure with random test chamber effects is studied. A greedy two-phased approach shows that different test chamber assignments to stress conditions substantially impact the estimation of the unknown parameters. Then, the D-optimal test plan with two test chambers is constructed by applying the quasi-likelihood approach. Lastly, the optimal ALT planning is extended to the case of multiple sources of random effects, so that the crossed design structure is considered along with the nested structure.
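
A minimal sketch of the quadrature idea, under toy assumptions of my own (exponential lifetimes whose log-mean is beta0 + beta1 * stress + b_g, with group effect b_g ~ N(0, sigma^2)): the group effect is integrated out with Gauss-Hermite nodes to form the marginal log-likelihood that ML estimation would then maximize iteratively.

```python
import numpy as np
from numpy.polynomial.hermite_e import hermegauss  # probabilists' Hermite nodes

def marginal_loglik(params, groups):
    """Marginal log-likelihood of grouped exponential lifetimes (toy model)."""
    beta0, beta1, log_sigma = params
    sigma = np.exp(log_sigma)
    nodes, weights = hermegauss(20)          # integrate over b ~ N(0, 1)
    weights = weights / np.sqrt(2 * np.pi)
    ll = 0.0
    for stress, times in groups:
        # conditional log-likelihood at each quadrature node, b = sigma * z
        mu = beta0 + beta1 * stress + sigma * nodes[:, None]
        cond = (-mu - times[None, :] * np.exp(-mu)).sum(axis=1)
        ll += np.log(np.dot(weights, np.exp(cond)))
    return ll

rng = np.random.default_rng(0)
groups = [(s, rng.exponential(np.exp(8 - 2 * s + rng.normal(0, 0.3)), 10))
          for s in (0.0, 0.5, 1.0)]          # three stress groups, 10 units each
print(marginal_loglik((8.0, -2.0, np.log(0.3)), groups))
```

Maximizing this function (e.g., with scipy.optimize.minimize on its negative) recovers the fixed effects together with the variance component of the random group effect.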

Date Created
  • 2017

Data-Driven Robust Optimization in Healthcare Applications

Description

Healthcare operations have enjoyed reduced costs, improved patient safety, and innovation in healthcare policy over a huge variety of applications by tackling problems via the creation and optimization of descriptive mathematical models to guide decision-making. Despite these accomplishments, models are stylized representations of real-world applications, reliant on accurate estimations from historical data to justify their underlying assumptions. To protect against unreliable estimations, which can adversely affect the decisions generated from applications dependent on fully-realized models, techniques that are robust against misspecifications are utilized while still making use of incoming data for learning. Hence, new robust techniques are applied that (1) allow the decision-maker to express a spectrum of pessimism against model uncertainties while (2) still utilizing incoming data for learning. Two main applications are investigated with respect to these goals. The first is a percentile optimization technique for a multi-class queueing system, applied to hospital emergency departments. The second studies the use of robust forecasting techniques to improve developing countries' vaccine supply chains via (1) an innovative outside-the-cold-chain policy and (2) a district-managed approach to inventory control. Both of these research application areas utilize data-driven approaches that feature learning and pessimism-controlled robustness.
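
As a toy illustration of pessimism-controlled robustness only (the model and numbers are mine, not the dissertation's queueing or supply-chain formulations), the sketch below picks a capacity level whose score interpolates between average-case and worst-case cost over an uncertainty set of arrival rates.

```python
import numpy as np

rates = np.linspace(8.0, 14.0, 25)       # plausible arrival rates (uncertainty set)
capacities = np.arange(9, 21)

def cost(c, lam, over=1.0, under=5.0):
    """Linear overage/underage cost of capacity c under arrival rate lam."""
    return over * np.maximum(c - lam, 0) + under * np.maximum(lam - c, 0)

def robust_choice(eps):
    # eps = 0 is pure average-case; eps = 1 is pure worst-case
    scores = [(1 - eps) * cost(c, rates).mean() + eps * cost(c, rates).max()
              for c in capacities]
    return capacities[int(np.argmin(scores))]

for eps in (0.0, 0.5, 1.0):
    print(f"pessimism {eps:.1f} -> capacity {robust_choice(eps)}")
```

Raising the pessimism parameter shifts the decision toward hedging against the worst plausible rate, mirroring the spectrum-of-pessimism idea described above.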

Date Created
  • 2018

Maximin designs for event-related fMRI with uncertain error correlation

Description

One of the premier technologies for studying human brain function is event-related functional magnetic resonance imaging (fMRI). The main design issue for such experiments is to find the optimal sequence of mental stimuli. This optimal design sequence allows for collecting informative data to make precise statistical inferences about the inner workings of the brain. Unfortunately, this is not an easy task, especially when the error correlation of the response is unknown at the design stage. In the literature, the maximin approach was proposed to tackle this problem. However, it is an expensive and time-consuming method, especially when the correlated noise follows a high-order autoregressive model. The main focus of this dissertation is to develop an efficient approach that reduces the computational resources needed to obtain A-optimal designs for event-related fMRI experiments. One proposed idea is to combine the Kriging approximation method, which is widely used in spatial statistics and computer experiments, with a knowledge-based genetic algorithm. Through case studies, it is demonstrated that the new search method achieves design efficiencies similar to those attained by the traditional method, while giving a significant reduction in computing time. Another strategy is proposed to find such designs by considering only the boundary points of the parameter space of the correlation parameters; its usefulness is also demonstrated via case studies. The first part of this dissertation focuses on finding optimal event-related designs for fMRI with simple trials, in which each stimulus consists of only one component (e.g., a picture). The study is then extended to compound trials, in which stimuli with multiple components (e.g., a cue followed by a picture) are considered.
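
A simplified sketch of the boundary-point strategy, with toy stimulus designs and an assumed AR(1) error structure (not the dissertation's algorithm or models): each candidate design is scored by its worst A-criterion value over the boundary of the correlation-parameter space, and the design with the best worst case is kept.

```python
import numpy as np

def ar1_cov(n, rho):
    """AR(1) correlation matrix, an assumed stand-in for the noise model."""
    idx = np.arange(n)
    return rho ** np.abs(idx[:, None] - idx[None, :])

def a_criterion(X, rho):
    """Average variance of the GLS estimates under AR(1) errors (A-value)."""
    Vinv = np.linalg.inv(ar1_cov(len(X), rho))
    return np.trace(np.linalg.inv(X.T @ Vinv @ X))

def maximin_score(X, rho_bounds=(0.0, 0.5)):
    # worst (largest) A-value over the boundary of the correlation space
    return max(a_criterion(X, r) for r in rho_bounds)

rng = np.random.default_rng(3)
# toy stimulus sequences: columns indicate which of two event types occurs
designs = [rng.integers(0, 2, size=(60, 2)).astype(float) for _ in range(200)]
best = min(designs, key=maximin_score)
print("best worst-case A-value:", round(maximin_score(best), 3))
```

The dissertation replaces the random search above with a knowledge-based genetic algorithm and uses Kriging to approximate the expensive criterion evaluations.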

Date Created
  • 2019

Capacity Planning, Production and Distribution Scheduling for a Multi-Facility and Multi-Product Supply Chain Network

Description

In today’s rapidly changing world and competitive business environment, firms are challenged to build their production and distribution systems to provide the desired customer service at the lowest possible cost. Designing an optimal supply chain by optimizing supply chain operations and decisions is key to achieving these goals.

In this research, a capacity planning and production scheduling mathematical model for a multi-facility, multi-product supply chain network with significant capital and labor costs is first proposed. This model considers the key levers of capacity configuration at production plants, namely shifts, run rate, down periods, finished-goods inventory management, and overtime. It suggests a minimum-cost plan for meeting medium-range demand forecasts, indicating production and inventory levels at plants by time period, the associated manpower plan, and outbound shipments over the planning horizon. This dissertation then investigates two model extensions: production flexibility and pricing. In the first extension, the costs and benefits of investing in production flexibility are studied. In the second extension, product pricing decisions are added to the model for demand shaping, taking into account the price elasticity of demand.

The research develops methodologies to optimize supply chain operations by determining the optimal capacity plan and the optimal flows of products among facilities, based on a nonlinear mixed-integer programming formulation. For large, real-life cases the problem is intractable, so an alternate formulation and an iterative heuristic algorithm are proposed and tested, and the performance and bounds of the heuristic are evaluated. A real-life case study in the automotive industry is used to implement the proposed models. The implementation results illustrate that the proposed method provides valuable insights for decision-making in the supply chain and yields significant improvement over current practice.
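
As a deliberately scaled-down illustration only (two plants, one product, one period, with invented costs), here is the linear-programming core of such a production-allocation decision; the dissertation's actual model is a far richer nonlinear mixed-integer formulation covering shifts, run rates, inventory, manpower, and shipments.

```python
from scipy.optimize import linprog

# decision variables: x1, x2 = production quantities at plants 1 and 2
cost = [4.0, 6.0]            # unit production-plus-shipping cost per plant (assumed)
A_ub = [[1, 0], [0, 1]]      # plant capacity limits
b_ub = [120, 150]
A_eq = [[1, 1]]              # total production must meet demand
b_eq = [200]

res = linprog(c=cost, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=(0, None))
print("production plan:", res.x, "total cost:", res.fun)
```

The optimal plan loads the cheaper plant to capacity (120 units) and covers the remaining 80 units from the second plant.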

Date Created
  • 2020

Optimal Sampling Designs for Functional Data Analysis

Description

Functional regression models are widely considered in practice. To precisely understand an underlying functional mechanism, a good sampling schedule for collecting informative functional data is necessary, especially when data collection is limited. However, little research has been conducted on optimal sampling schedule design for functional regression models so far. To address this design issue, efficient approaches are proposed for generating the best sampling plan in the functional regression setting. First, three optimal experimental designs are considered under a function-on-function linear model: the schedule that maximizes the relative efficiency for recovering the predictor function, the schedule that maximizes the relative efficiency for predicting the response function, and the schedule that maximizes a mixture of the relative efficiencies of both. The obtained sampling plan allows a precise recovery of the predictor function and a precise prediction of the response function. The proposed approach can also be reduced to identify the optimal sampling plan for a scalar-on-function linear regression model. In addition, the optimality criterion for predicting a scalar response from a functional predictor is derived for the case of a quadratic relationship between the two variables, and proofs of important properties of the derived criterion are provided. To find such designs, an algorithm is proposed that is comparably fast and can generate nearly optimal designs. As the optimality criterion includes quantities that must be estimated from prior knowledge (e.g., a pilot study), the effectiveness of the suggested optimal design depends strongly on the quality of the estimates. In many situations, however, the estimates are unreliable; thus, a bootstrap aggregating (bagging) approach is employed to enhance the quality of the estimates and to find sampling schedules that are stable to their misspecification. Through case studies, it is demonstrated that the proposed designs outperform other designs in terms of accurately predicting the response and recovering the predictor, and that the bagging-enhanced design is more robust to misspecification of the estimated quantities.
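
A greedy sketch of the sampling-schedule idea, under strong simplifications of my own (a polynomial basis standing in for the functional basis, and a log-determinant information criterion standing in for the relative-efficiency criteria above): sampling points are added one at a time wherever they most improve the information about the basis coefficients.

```python
import numpy as np

def basis(t, degree=4):
    """Polynomial basis, an assumed stand-in for a functional (e.g., spline) basis."""
    return np.vander(t, degree + 1, increasing=True)

def greedy_schedule(candidates, k):
    chosen = []
    for _ in range(k):
        best, best_val = None, -np.inf
        for t in candidates:
            if t in chosen:
                continue
            B = basis(np.array(chosen + [t]))
            # ridge term keeps the information matrix invertible early on
            _, logdet = np.linalg.slogdet(B.T @ B + 1e-8 * np.eye(B.shape[1]))
            if logdet > best_val:
                best, best_val = t, logdet
        chosen.append(best)
    return sorted(chosen)

grid = list(np.linspace(0, 1, 51))
print("greedy 8-point schedule:", greedy_schedule(grid, 8))
```

The bagging enhancement described above would re-estimate the criterion's unknown quantities across bootstrap replicates of the pilot data and favor schedules that score well on average.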

Date Created
  • 2020

Spatio-temporal data mining to detect changes and clusters in trajectories

Description

With the rapid development of mobile sensing technologies such as GPS, RFID, and smartphone sensors, capturing position data in the form of trajectories has become easy. Moving-object trajectory analysis is a growing area of interest owing to its applications in domains such as marketing, security, and traffic monitoring and management. To better understand movement behaviors from raw mobility data, this doctoral work provides analytic models for analyzing trajectory data. As a first contribution, a model is developed to detect changes in trajectories over time. If the taxis moving in a city are viewed as sensors that provide real-time information about the city's traffic, a change in these trajectories over time can reveal that the road network has changed. To detect changes, trajectories are modeled with a Hidden Markov Model (HMM). A modified training algorithm for parameter estimation in HMMs, called m-BaumWelch, is used to develop likelihood estimates under assumed changes and to detect changes in trajectory data over time. Data from vehicles are used to test the method for change detection. Secondly, sequential pattern mining is used to develop a model to detect changes in frequent patterns occurring in trajectory data. The aim is to answer two questions: Are the frequent patterns still frequent in the new data? If they are, has the time-interval distribution within the pattern changed? Two approaches are considered for change detection: a frequency-based approach and a distribution-based approach. The methods are illustrated with vehicle trajectory data. Finally, a model is developed for clustering and outlier detection in semantic trajectories. A challenge with clustering semantic trajectories is that both numeric and categorical attributes are present. Another problem to be addressed while clustering is that trajectories can be of different lengths and can have missing values. A tree-based ensemble is used to address these problems, and the approach is extended to outlier detection in semantic trajectories.
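
A compact sketch of the change-detection scoring step, with a toy two-state HMM whose parameters are invented here and a plain scaled forward recursion rather than the modified m-BaumWelch training: new trajectory data are scored under the model fitted to old data, and a sharp drop in log-likelihood flags a possible change.

```python
import numpy as np

def forward_loglik(obs, pi, A, B):
    """Log-likelihood of a discrete observation sequence under an HMM."""
    alpha = pi * B[:, obs[0]]
    ll = np.log(alpha.sum()); alpha /= alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]      # predict, then weight by emission
        ll += np.log(alpha.sum()); alpha /= alpha.sum()
    return ll

pi = np.array([0.5, 0.5])                          # initial state probabilities
A = np.array([[0.9, 0.1], [0.2, 0.8]])             # state transitions (assumed)
B = np.array([[0.8, 0.2], [0.3, 0.7]])             # emission probabilities (assumed)
rng = np.random.default_rng(7)
new_obs = rng.choice(2, size=200, p=[0.6, 0.4])    # stand-in discretized trajectory
print("log-likelihood under the old model:", forward_loglik(new_obs, pi, A, B))
```

Comparing this score against the likelihood under re-estimated parameters gives the kind of likelihood-based change statistic the first contribution builds on.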

Date Created
  • 2012

Product design optimization under epistemic uncertainty

Description

This dissertation addresses product design optimization, including reliability-based design optimization (RBDO) and robust design, under epistemic uncertainty. It is divided into four major components, as outlined below.

Firstly, a comprehensive study of uncertainties is performed, in which sources of uncertainty are listed and categorized and their impacts are discussed. Epistemic uncertainty, which is due to lack of knowledge and can be reduced by taking more observations, is of primary interest. In particular, strategies to address epistemic uncertainty due to an implicit constraint function are discussed.

Secondly, a sequential sampling strategy to improve RBDO under an implicit constraint function is developed. In modern engineering design, an RBDO task is often performed by a computer simulation program, which can be treated as a black box since its analytical function is implicit. An efficient sampling strategy for learning the probabilistic constraint function within the design optimization framework is presented. The method is a sequential experimentation around the approximate most probable point (MPP) at each step of the optimization process. It is compared with MPP-based sampling, the lifted surrogate function, and non-sequential random sampling.

Thirdly, a particle splitting-based reliability analysis approach for design optimization is developed. In reliability analysis, traditional simulation methods such as Monte Carlo simulation may provide accurate results but often come with high computational cost. To increase efficiency, particle splitting is integrated into RBDO. It is an improvement of subset simulation that uses multiple particles to enhance the diversity and stability of the simulation samples. This method is further extended to problems with multiple probabilistic constraints and compared with the MPP-based methods.

Finally, a reliability-based robust design optimization (RBRDO) framework is provided to consider design reliability and design robustness simultaneously. The quality loss objective in robust design, considered together with the production cost in RBDO, is used to formulate a multi-objective optimization problem. With the epistemic uncertainty from the implicit performance function, the sequential sampling strategy is extended to RBRDO, and a combined metamodel is proposed to handle both controllable and uncontrollable variables. The solution is a Pareto frontier, in contrast to the single optimal solution of RBDO.
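
For reference, here is the plain Monte Carlo baseline that the MPP-based sequential sampling and particle-splitting strategies aim to beat on computational cost; the limit-state function below is an arbitrary toy choice, with g(x) < 0 denoting failure.

```python
import numpy as np

def limit_state(x):
    """Toy implicit performance function; g(x) < 0 denotes failure."""
    return 6.0 - x[:, 0] ** 2 - x[:, 1]

rng = np.random.default_rng(11)
n = 200_000
samples = rng.normal(size=(n, 2))        # standard normal design variables (assumed)
pf = np.mean(limit_state(samples) < 0.0)
se = np.sqrt(pf * (1 - pf) / n)
print(f"estimated failure probability: {pf:.5f} (MC std. err. {se:.5f})")
```

For small failure probabilities, the number of samples needed for a stable estimate grows rapidly, which is exactly the inefficiency that subset simulation with particle splitting targets.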

Date Created
  • 2012

Bayesian D-optimal design issues and optimal design construction methods for generalized linear models with random blocks

Description

Optimal experimental design for generalized linear models is often done using a pseudo-Bayesian approach that integrates the design criterion across a prior distribution on the parameter values. This approach ignores the lack of utility of certain models contained in the prior, and a case is demonstrated where the heavy focus on such hopeless models results in a design with poor performance and with wild swings in coverage probabilities for Wald-type confidence intervals. Design construction using a utility-based approach is shown to result in much more stable coverage probabilities in the area of greatest concern.

The pseudo-Bayesian approach can be applied to the problem of optimal design construction under dependent observations. Often, correlation between observations exists due to restrictions on randomization. Several techniques for optimal design construction are proposed for the case in which the conditional response distribution is a member of the natural exponential family but with a normally distributed block effect. The reviewed pseudo-Bayesian approach is compared to an approach that substitutes the joint likelihood for the marginal likelihood and to an approach based on projections of the score function (often called quasi-likelihood). These approaches are compared for several models with normal, Poisson, and binomial conditional response distributions via the true determinant of the expected Fisher information matrix, where the dispersion of the random blocks is treated as a nuisance parameter. A case study using the developed methods is performed.

The joint- and quasi-likelihood methods are then extended to address the case in which the magnitude of the random block dispersion is of concern. Again, a simulation study over several models is performed, followed by a case study in which the conditional response distribution is Poisson.
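
A minimal sketch of the pseudo-Bayesian D-criterion itself, using a logistic model, an invented normal prior, and two hypothetical candidate designs: the log-determinant of the Fisher information is averaged over draws from the prior, and candidate designs are compared on that average.

```python
import numpy as np

def logistic_info(design, beta):
    """Fisher information of a logistic model with intercept and slope."""
    X = np.column_stack([np.ones_like(design), design])
    p = 1.0 / (1.0 + np.exp(-(X @ beta)))
    W = p * (1 - p)
    return X.T @ (W[:, None] * X)

def pseudo_bayes_d(design, prior_draws):
    # average log-det information over the prior (larger is better)
    return np.mean([np.linalg.slogdet(logistic_info(design, b))[1]
                    for b in prior_draws])

rng = np.random.default_rng(5)
prior = rng.normal([0.0, 1.0], [0.5, 0.5], size=(500, 2))   # assumed prior on (b0, b1)
designs = {"spread": np.array([-3.0, -1.0, 1.0, 3.0]),
           "narrow": np.array([-0.5, -0.2, 0.2, 0.5])}
for name, d in designs.items():
    print(name, round(pseudo_bayes_d(d, prior), 3))
```

A utility-based variant of the kind advocated above would reweight or screen the prior draws so that hopeless parameter values do not dominate this average.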

Date Created
  • 2015