Matching Items (9)
Description
A P-value based method is proposed for statistical monitoring of various types of profiles in phase II. The performance of the proposed method is evaluated by the average run length criterion under various shifts in the intercept, slope, and error standard deviation of the model. In our proposed approach, P-values are computed at each level within a sample. If at least one of the P-values is less than a pre-specified significance level, the chart signals an out-of-control condition. The primary advantage of our approach is that only one control chart is required to monitor several parameters simultaneously: the intercept, slope(s), and the error standard deviation. A comprehensive comparison of the proposed method and the existing KMW-Shewhart method for monitoring linear profiles is conducted. In addition, the effect that the number of observations within a sample has on the performance of the proposed method is investigated. The proposed method is also compared to the T^2 method discussed in Kang and Albin (2000) for multivariate, polynomial, and nonlinear profiles. A simulation study shows that, overall, the proposed P-value method performs satisfactorily across different profile types.
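The signaling rule described in this abstract can be illustrated with a short sketch. The following Python snippet is not from the dissertation; it assumes a simple linear profile with known in-control intercept, slope, and error standard deviation, and an illustrative significance level alpha.

```python
import numpy as np
from scipy import stats

def pvalue_chart_signal(x, y, beta0, beta1, sigma, alpha=0.005):
    """Sketch of the signaling rule: compute a two-sided normal P-value at
    each level x_j of the sampled profile and flag the sample if any
    P-value falls below the pre-specified significance level alpha."""
    mu = beta0 + beta1 * x                    # in-control mean at each level
    z = (y - mu) / sigma                      # standardized deviations
    pvals = 2 * stats.norm.sf(np.abs(z))      # two-sided P-values
    return bool(np.any(pvals < alpha)), pvals

# One simulated in-control profile sample (all parameter values are illustrative)
rng = np.random.default_rng(0)
x = np.linspace(0, 1, 10)
y = 3.0 + 2.0 * x + rng.normal(0, 0.5, size=x.size)
print(pvalue_chart_signal(x, y, beta0=3.0, beta1=2.0, sigma=0.5))
```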
Contributors: Adibi, Azadeh (Author) / Montgomery, Douglas C. (Thesis advisor) / Borror, Connie (Thesis advisor) / Li, Jing (Committee member) / Zhang, Muhong (Committee member) / Arizona State University (Publisher)
Created: 2013
Description
In a healthcare setting, the Sterile Processing Department (SPD) provides ancillary services to the Operating Room (OR), Emergency Room, Labor & Delivery, and off-site clinics. SPD's function is to reprocess reusable surgical instruments and return them to their home departments. The management of surgical instruments and medical devices can impact patient safety and hospital revenue. Any time instrumentation or devices are not available or are not fit for use, patient safety and revenue can be negatively impacted. One step of the instrument reprocessing cycle is sterilization. Steam sterilization is the sterilization method used for the majority of surgical instruments and is preferred to immediate use steam sterilization (IUSS) because terminally sterilized items can be stored until needed. IUSS items must be used promptly and cannot be stored for later use. IUSS is intended for emergency situations, not as a regular course of action. Unfortunately, IUSS is used to compensate for inadequate inventory levels, scheduling conflicts, and miscommunications. If IUSS is viewed as an adverse event, then monitoring IUSS incidences can help healthcare organizations meet patient safety and financial goals while aiding process improvement efforts. This work recommends applying statistical process control methods to IUSS incidents and illustrates the use of control charts for IUSS occurrences through a case study and analysis of control charts for data from a health care provider. Furthermore, this work considers the application of data mining methods to IUSS occurrences and presents a representative example of data mining applied to IUSS occurrences. This extends the application of statistical process control and data mining to healthcare settings.
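As an illustration of charting IUSS occurrences, a u-chart for event rates is one standard SPC choice. The Python sketch below is hypothetical and does not use the provider's data or necessarily the chart form adopted in the thesis; the monthly counts and case volumes are invented.

```python
import numpy as np

def u_chart_limits(events, units):
    """u-chart sketch: IUSS events per period charted against the number of
    surgical cases (or sterilization loads) processed in that period."""
    events = np.asarray(events, dtype=float)
    units = np.asarray(units, dtype=float)
    u = events / units                        # observed IUSS rate per period
    u_bar = events.sum() / units.sum()        # center line: overall average rate
    ucl = u_bar + 3 * np.sqrt(u_bar / units)  # 3-sigma limits vary with volume
    lcl = np.maximum(u_bar - 3 * np.sqrt(u_bar / units), 0.0)
    return u, u_bar, lcl, ucl

# Hypothetical monthly IUSS events and surgical case volumes
events = [4, 7, 3, 9, 5]
cases = [310, 298, 305, 320, 300]
u, u_bar, lcl, ucl = u_chart_limits(events, cases)
print(np.where(u > ucl)[0])   # months signaling unusually high IUSS use
```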
Contributors: Weart, Gail (Author) / Runger, George C. (Thesis advisor) / Li, Jing (Committee member) / Shunk, Dan (Committee member) / Arizona State University (Publisher)
Created: 2014
Description
Many products undergo several stages of testing, ranging from tests on individual components to end-item tests. Additionally, these products may be further "tested" via customer or field use. The later failure of a delivered product may in some cases be due to circumstances that have no correlation with the product's inherent quality. However, at times there may be cues in the upstream test data that, if detected, could serve to predict the likelihood of downstream failure or performance degradation induced by product use or environmental stresses. This study explores the use of downstream factory test data or product field reliability data to derive data mining or pattern recognition criteria for manufacturing process or upstream test data by means of support vector machines (SVMs) in order to provide reliability prediction models. In concert with a risk/benefit analysis, these models can be utilized to drive improvement of the product or, at least, to improve the reliability of the product delivered to the customer via screening. Such models can be used to aid in reliability risk assessment based on detectable correlations between product test performance and the sources of supply, test stands, or other factors related to product manufacture. As an enhancement to the usefulness of the SVM or hyperplane classifier within this context, L-moments and the Western Electric Company (WECO) rules are used to augment or replace the native process or test data used as inputs to the classifier. As part of this research, a generalizable binary classification methodology was developed that can be used to design and implement predictors of end-item field failure or downstream product performance based on upstream test data that may be composed of single-parameter, time-series, or multivariate real-valued data. Additionally, the methodology provides input parameter weighting factors that have proved useful in failure analysis and root cause investigations as indicators of which of several upstream product parameters have the greatest influence on the downstream failure outcomes.
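A minimal sketch of the core idea, summarizing upstream test traces with sample L-moments and training an SVM to predict downstream failure, is shown below. It is illustrative Python only: the simulated traces, labels, and feature choice (first two L-moments) are assumptions, not the dissertation's data or full feature set.

```python
import numpy as np
from sklearn.svm import SVC

def sample_l_moments(series):
    """First two sample L-moments (location l1 and L-scale l2) of a 1-D test
    trace, computed from probability weighted moments; used as summary features."""
    x = np.sort(np.asarray(series, dtype=float))
    n = x.size
    i = np.arange(1, n + 1)
    b0 = x.mean()
    b1 = np.sum((i - 1) / (n - 1) * x) / n
    return np.array([b0, 2.0 * b1 - b0])

# Simulated upstream test traces with known field outcomes (1 = field failure)
rng = np.random.default_rng(1)
traces = rng.normal(0.0, 1.0, size=(40, 50))
traces[:10] += rng.normal(0.8, 0.3, size=(10, 50))    # failed units carry a shift
labels = np.r_[np.ones(10, dtype=int), np.zeros(30, dtype=int)]

X = np.vstack([sample_l_moments(t) for t in traces])  # L-moment feature matrix
clf = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X, labels)
print(clf.score(X, labels))    # training accuracy of the screening classifier
```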
Contributors: Mosley, James (Author) / Morrell, Darryl (Committee member) / Cochran, Douglas (Committee member) / Papandreou-Suppappola, Antonia (Committee member) / Roberts, Chell (Committee member) / Spanias, Andreas (Committee member) / Arizona State University (Publisher)
Created: 2011
Description
The complexity of supply chains (SC) has grown rapidly in recent years, making performance increasingly difficult to evaluate and visualize. Consequently, analytical approaches to evaluate SC performance in near real time, relative to targets and plans, are important for detecting and reacting to deviations in order to prevent major disruptions.

Manufacturing anomalies, inaccurate forecasts, and other problems can lead to SC disruptions. Traditional monitoring methods are not sufficient in this respect, because complex SCs feature changes in manufacturing tasks (dynamic complexity) and carry a large number of stock keeping units (detail complexity). Problems are easily confounded with normal system variations.

Motivated by these real challenges faced by modern SCs, new surveillance solutions are proposed to detect system deviations that could lead to disruptions in a complex SC. To address supply-side deviations, the fitness of different statistics that can be extracted from the enterprise resource planning system is evaluated. A monitoring strategy is first proposed for SCs featuring high levels of dynamic complexity. This presents an opportunity for monitoring methods to be applied in a new, rich domain of SC management. Then a monitoring strategy, called Heat Map Contrasts (HMC), which converts monitoring into a series of classification problems, is used to monitor SCs with high levels of both dynamic and detail complexity. Data from a semiconductor SC simulator are used to compare the methods with other alternatives under various failure cases, and the results illustrate the viability of our methods.
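The idea of converting monitoring into a series of classification problems can be sketched generically as follows. This Python snippet is not the HMC implementation; it simply contrasts a reference window of in-control data with the current window using a classifier, with an illustrative accuracy threshold and simulated data.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def contrast_signal(reference, window, threshold=0.65):
    """Monitoring-as-classification sketch: label reference data 0 and the
    current window 1, then check whether a classifier separates them.
    Cross-validated accuracy well above chance suggests a distribution change."""
    X = np.vstack([reference, window])
    y = np.r_[np.zeros(len(reference), dtype=int), np.ones(len(window), dtype=int)]
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    acc = cross_val_score(clf, X, y, cv=5).mean()
    return acc > threshold, acc

rng = np.random.default_rng(2)
reference = rng.normal(0.0, 1.0, size=(200, 8))   # in-control SKU-level metrics
window = rng.normal(0.7, 1.0, size=(50, 8))       # current window with a mean shift
print(contrast_signal(reference, window))
```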

To address demand-side deviations, a new method of quantifying forecast uncertainties using the progression of forecast updates is presented. It is illustrated that a rich amount of information is available in rolling horizon forecasts. Two proactive indicators of future forecast errors are extracted from the forecast stream. This quantitative method requires no knowledge of the forecasting model itself and has shown promising results when applied to two datasets consisting of real forecast updates.
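The abstract does not specify the two indicators, so the sketch below illustrates the general idea with two plausible, assumed quantities extracted from a rolling-horizon forecast stream for a single target period: the cumulative revision and the volatility of successive updates. The data are invented.

```python
import numpy as np

def forecast_update_indicators(forecast_stream):
    """Extract two illustrative indicators from successive forecasts of the
    same target period: total revision (drift) and revision volatility."""
    f = np.asarray(forecast_stream, dtype=float)
    revisions = np.diff(f)                            # change at each update
    drift = f[-1] - f[0]                              # cumulative revision so far
    volatility = revisions.std(ddof=1) if revisions.size > 1 else 0.0
    return drift, volatility

# Hypothetical weekly forecast updates for one delivery week
stream = [120, 118, 125, 131, 128, 140, 138, 145]
print(forecast_update_indicators(stream))
```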
Contributors: Liu, Lei (Author) / Runger, George C. (Thesis advisor) / Gel, Esma (Committee member) / Pan, Rong (Committee member) / Janakiram, Mani (Committee member) / Arizona State University (Publisher)
Created: 2015
Description
Transfer learning refers to statistical machine learning methods that integrate the knowledge of one domain (source domain) and the data of another domain (target domain) in an appropriate way, in order to develop a model for the target domain that is better than a model using the data of the target domain alone. Transfer learning emerged because classic machine learning, when used to model different domains, has to take one of two mechanical approaches: either assume the data distributions of the different domains to be the same and thereby develop one model that fits all, or develop one model for each domain independently. Transfer learning, on the other hand, aims to mitigate the limitations of the two approaches by accounting for both the similarity and specificity of related domains. The objective of my dissertation research is to develop new transfer learning methods and demonstrate the utility of the methods in real-world applications. Specifically, in my methodological development, I focus on two different transfer learning scenarios: spatial transfer learning across different domains and temporal transfer learning along time in the same domain. Furthermore, I apply the proposed spatial transfer learning approach to modeling of degenerate biological systems. Degeneracy is a well-known characteristic, widely existing in many biological systems, that contributes to the heterogeneity, complexity, and robustness of biological systems. In particular, I study one application involving a degenerate biological system: using transcription factor (TF) binding sites to predict gene expression across multiple cell lines. Also, I apply the proposed temporal transfer learning approach to change detection in dynamic network data. Change detection is a classic research area in Statistical Process Control (SPC), but change detection in network data has received limited study. I integrate the temporal transfer learning method, called the Network State Space Model (NSSM), with SPC and formulate the problem of change detection in dynamic networks as a covariance monitoring problem. I demonstrate the performance of the NSSM in change detection of dynamic social networks.
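For the covariance-monitoring formulation mentioned at the end of this abstract, one standard (and purely illustrative) monitoring statistic compares a window's sample covariance to an in-control covariance via a likelihood-ratio form. The Python sketch below is not the NSSM-based procedure from the dissertation; the dimensions, data, and use of an asymptotic chi-square reference are assumptions.

```python
import numpy as np
from scipy import stats

def covariance_lrt(window, sigma0):
    """Likelihood-ratio style statistic for H0: covariance equals sigma0,
    with an asymptotic chi-square P-value; a generic covariance-monitoring
    statistic, not the dissertation's specific procedure."""
    X = np.asarray(window, dtype=float)
    n, p = X.shape
    S = np.cov(X, rowvar=False)                     # sample covariance of the window
    M = S @ np.linalg.inv(sigma0)
    stat = (n - 1) * (np.trace(M) - np.log(np.linalg.det(M)) - p)
    df = p * (p + 1) / 2
    return stat, stats.chi2.sf(stat, df)

rng = np.random.default_rng(3)
sigma0 = np.eye(4)                                  # in-control covariance (assumed known)
print(covariance_lrt(rng.normal(size=(60, 4)), sigma0))
```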
Contributors: Zou, Na (Author) / Li, Jing (Thesis advisor) / Baydogan, Mustafa (Committee member) / Borror, Connie (Committee member) / Montgomery, Douglas C. (Committee member) / Wu, Teresa (Committee member) / Arizona State University (Publisher)
Created: 2015
Description
The emergence of new technologies, as well as a fresh look at analyzing existing processes, has given rise to a new type of response characteristic known as a profile. Profiles are useful when a quality variable is functionally dependent on one or more explanatory, or independent, variables. So, instead of observing a single measurement on each unit or product, a set of values is obtained over a range that, when plotted, takes the shape of a curve. Traditional multivariate monitoring schemes are inadequate for monitoring profiles due to high dimensionality and poor use of the information stored in functional form, leading to very large variance-covariance matrices. Profile monitoring has become an important area of study in statistical process control and is being actively addressed by researchers across the globe. This research explores the area in three parts. A comparative analysis is conducted of two linear profile-monitoring techniques based on the probability of a false alarm and the average run length (ARL) under shifts in the model parameters. The two techniques studied are a control chart based on the classical calibration statistic and a control chart based on the parameters of a linear model. The research demonstrates that monitoring a profile through a parametric model is more efficient than monitoring only the individual features of the profile. A likelihood ratio based changepoint control chart is proposed for detecting a sustained step shift in low order polynomial profiles. The test statistic is plotted on a Shewhart-like chart with control limits derived from asymptotic distribution theory. The statistic is factored to reflect the variation due to the parameters, to aid in interpreting an out-of-control signal. The research also looks at the robust parameter design study of profiles, also referred to as signal-response systems. Such experiments are often necessary for understanding and reducing the common cause variation in systems. A split-plot approach is proposed to analyze the profiles. It is demonstrated that explicit modeling of variance components using a generalized linear mixed models approach yields more precise point estimates and tighter confidence intervals.
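As a small illustration of the parametric approach to profile monitoring, the sketch below fits each sampled linear profile by least squares and returns the intercept, slope, and residual standard deviation that would then be charted. It is generic Python, not the changepoint chart or calibration statistic studied in the dissertation, and the simulated profile values are assumptions.

```python
import numpy as np

def profile_parameters(x, y):
    """Least-squares intercept, slope, and residual standard deviation of one
    observed linear profile; these per-sample estimates are the quantities a
    parametric profile-monitoring scheme would track."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    slope, intercept = np.polyfit(x, y, 1)
    resid = y - (intercept + slope * x)
    sigma_hat = np.sqrt(resid @ resid / (x.size - 2))
    return intercept, slope, sigma_hat

# One simulated profile sample (true intercept 1.0, slope 0.5, sigma 0.2)
rng = np.random.default_rng(4)
x = np.linspace(0, 10, 15)
y = 1.0 + 0.5 * x + rng.normal(0, 0.2, size=x.size)
print(profile_parameters(x, y))
```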
Contributors: Gupta, Shilpa (Author) / Montgomery, Douglas C. (Thesis advisor) / Borror, Connie M. (Thesis advisor) / Fowler, John (Committee member) / Prewitt, Kathy (Committee member) / Kulahci, Murat (Committee member) / Arizona State University (Publisher)
Created: 2010
Description
In mixture-process variable experiments, it is common that the number of runs is greater than in mixture-only or process-variable experiments. These experiments have to estimate the parameters associated with the mixture components, the process variables, and the interactions between the two. In some of these experiments there are variables that are hard to change or cannot be controlled under normal operating conditions. These situations often prohibit complete randomization of the experimental runs due to practical and economic considerations. Furthermore, the process variables can be categorized into two types: variables that are controllable and directly affect the response, and variables that are uncontrollable and primarily affect the variability of the response. These uncontrollable variables are called noise factors and are assumed controllable in a laboratory environment for the purpose of conducting experiments. A model containing both noise variables and control factors can be used to determine settings of the control factors that make the response "robust" to the variability transmitted from the noise factors. These types of experiments can be analyzed with a model for the mean response and a model for the slope of the response within a split-plot structure. When considering the experimental designs, low prediction variances for the mean and slope models are desirable. Methods for mixture-process variable designs with noise variables under restricted randomization are demonstrated, and some mixture-process variable designs that are robust to the coefficients of interaction with noise variables are evaluated using fraction of design space plots with respect to their prediction variance properties. Finally, a G-optimal design that minimizes the maximum prediction variance over the entire design region is created using a genetic algorithm.
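The prediction variance quantity that fraction of design space plots summarize, and whose maximum a G-optimal design minimizes, can be sketched for a toy design as follows. The Python snippet below uses a simple two-factor model rather than a mixture-process split-plot design; the model form, design points, and grid are illustrative assumptions only.

```python
import numpy as np

def scaled_prediction_variance(design, points, model):
    """Scaled prediction variance N * f(x)' (X'X)^-1 f(x): the quantity that
    fraction-of-design-space plots summarize and whose maximum a G-optimal
    design minimizes over the design region."""
    X = np.array([model(d) for d in design], dtype=float)
    N = X.shape[0]
    XtX_inv = np.linalg.inv(X.T @ X)
    F = np.array([model(p) for p in points], dtype=float)
    return N * np.einsum("ij,jk,ik->i", F, XtX_inv, F)

# Illustrative first-order-plus-interaction model in two coded process variables
model = lambda v: [1.0, v[0], v[1], v[0] * v[1]]
design = [(-1, -1), (1, -1), (-1, 1), (1, 1), (0, 0)]        # small candidate design
grid = [(a, b) for a in np.linspace(-1, 1, 21) for b in np.linspace(-1, 1, 21)]
spv = scaled_prediction_variance(design, grid, model)
print(spv.max())   # sample estimate of the G-criterion over the region
```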
Contributors: Cho, Tae Yeon (Author) / Montgomery, Douglas C. (Thesis advisor) / Borror, Connie M. (Thesis advisor) / Shunk, Dan L. (Committee member) / Gel, Esma S. (Committee member) / Kulahci, Murat (Committee member) / Arizona State University (Publisher)
Created: 2010
Description

The majority of trust research has focused on the benefits trust can have for individual actors, institutions, and organizations. This “optimistic bias” is particularly evident in work focused on institutional trust, where concepts such as procedural justice, shared values, and moral responsibility have gained prominence. But trust in institutions may not be exclusively good. We reveal implications for the “dark side” of institutional trust by reviewing relevant theories and empirical research that can contribute to a more holistic understanding. We frame our discussion by suggesting there may be a “Goldilocks principle” of institutional trust, where trust that is too low (typically the focus) or too high (not usually considered by trust researchers) may be problematic. The chapter focuses on the issue of too-high trust and processes through which such too-high trust might emerge. Specifically, excessive trust might result from external, internal, and intersecting external-internal processes. External processes refer to the actions institutions take that affect public trust, while internal processes refer to intrapersonal factors affecting a trustor’s level of trust. We describe how the beneficial psychological and behavioral outcomes of trust can be mitigated or circumvented through these processes and highlight the implications of a “darkest” side of trust when they intersect. We draw upon research on organizations and legal, governmental, and political systems to demonstrate the dark side of trust in different contexts. The conclusion outlines directions for future research and encourages researchers to consider the ethical nuances of studying how to increase institutional trust.

Contributors: Neal, Tess M.S. (Author) / Shockley, Ellie (Author) / Schilke, Oliver (Author)
Created: 2016
Description
Statistical process control (SPC) and predictive analytics have been used in industrial manufacturing and design, but until now have not been applied to threshold data from vital sign monitoring in remote care settings. In this study of 20 elders with COPD and/or CHF, extended months of peak flow (FEV1) monitoring using telemedicine are examined to determine when an earlier or later clinical intervention may have been advised. This study demonstrated that SPC may bring less than a 2.0% increase in clinician workload while providing more robust statistically derived thresholds than clinician-derived thresholds. Using a random K-fold model, FEV1 output was predictively validated to a generalized R-square of 0.80, demonstrating adequate learning of a threshold classifier. Disease severity also impacted the model. Forecasting future FEV1 data points is possible with a complex ARIMA (45, 0, 49), but variation and sources of error require tight control. Validation was above average and encouraging for clinician acceptance. These statistical algorithms allow the patient's own data to drive reductions in variability and can potentially increase clinician efficiency, improve patient outcomes, and reduce the cost burden on the health care ecosystem.
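As a sketch of how statistically derived thresholds could replace fixed clinician-set thresholds for a single patient's FEV1 stream, the following Python snippet computes individuals-chart limits from the average moving range. The readings are invented and the snippet is not the study's algorithm or data.

```python
import numpy as np

def individuals_chart_limits(readings):
    """Individuals (X) chart sketch: statistically derived limits for a single
    patient's FEV1 readings, based on the average moving range."""
    x = np.asarray(readings, dtype=float)
    mr_bar = np.mean(np.abs(np.diff(x)))   # average moving range of successive readings
    center = x.mean()
    half_width = 2.66 * mr_bar             # standard 3-sigma-equivalent factor
    return center - half_width, center, center + half_width

# Hypothetical daily FEV1 readings (liters) from one remote-monitored patient
fev1 = [1.82, 1.79, 1.85, 1.80, 1.77, 1.74, 1.70, 1.72, 1.65, 1.60]
lcl, center, ucl = individuals_chart_limits(fev1)
print(round(lcl, 3), round(center, 3), round(ucl, 3))
```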
Contributors: Fralick, Celeste (Author) / Muthuswamy, Jitendran (Thesis advisor) / O'Shea, Terrance (Thesis advisor) / LaBelle, Jeffrey (Committee member) / Pizziconi, Vincent (Committee member) / Shea, Kimberly (Committee member) / Arizona State University (Publisher)
Created: 2013