Matching Items (6)
Description
Nonregular screening designs can be an economical alternative to traditional resolution IV 2^(k-p) fractional factorials. Recently, 16-run nonregular designs, referred to as no-confounding designs, were introduced in the literature. These designs have the property that no pair of main effect (ME) and two-factor interaction (2FI) estimates is completely confounded. In this dissertation, orthogonal arrays were evaluated with many popular design-ranking criteria in order to identify optimal 20-run and 24-run no-confounding designs. Monte Carlo simulation was used to empirically assess the model-fitting effectiveness of the recommended no-confounding designs. The results of the simulation demonstrated that these new designs, particularly the 24-run designs, detect active effects over 95% of the time given sufficient model effect sparsity. The final chapter presents a screening design selection methodology, based on decision trees, to aid in the selection of a screening design from a list of published options. The methodology determines which of a candidate set of screening designs has the lowest expected experimental cost.
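As a hedged illustration of the Monte Carlo assessment described above, the sketch below simulates responses on a two-level design with a few active main effects, refits the model, and counts how often every active effect is declared significant. The random ±1 design matrix, effect sizes, run count, and significance level are stand-in assumptions for illustration; they are not the dissertation's actual no-confounding designs or settings.

```python
# Minimal Monte Carlo sketch: simulate sparse active effects, fit OLS,
# and estimate the empirical detection rate. All settings are illustrative.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_runs, n_factors, n_sims, alpha = 24, 6, 1000, 0.05
active = {0: 2.0, 3: 1.5}          # hypothetical active main effects (sparse)

hits = 0
for _ in range(n_sims):
    X = rng.choice([-1.0, 1.0], size=(n_runs, n_factors))  # stand-in design
    beta = np.zeros(n_factors)
    for j, b in active.items():
        beta[j] = b
    y = X @ beta + rng.normal(size=n_runs)
    # Ordinary least squares with an intercept column
    Xd = np.column_stack([np.ones(n_runs), X])
    coef, rss, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    dof = n_runs - Xd.shape[1]
    sigma2 = float(rss[0]) / dof
    se = np.sqrt(sigma2 * np.diag(np.linalg.inv(Xd.T @ Xd)))
    p = 2 * stats.t.sf(np.abs(coef / se), dof)
    # Count a success only if every active effect is detected
    if all(p[j + 1] < alpha for j in active):
        hits += 1

print(f"empirical detection rate: {hits / n_sims:.3f}")
```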
Contributors: Stone, Brian (Author) / Montgomery, Douglas C. (Thesis advisor) / Silvestrini, Rachel T. (Committee member) / Fowler, John W. (Committee member) / Borror, Connie M. (Committee member) / Arizona State University (Publisher)
Created: 2013
Description
This dissertation explores different methodologies for combining two popular design paradigms in the field of computer experiments. Space-filling designs are commonly used to ensure good coverage of the design space, but they may not have good properties for model fitting. Optimal designs traditionally perform very well in terms of model fitting, particularly when a polynomial model is assumed, but can result in problematic replication when factors turn out to be insignificant. By bringing these two design types together, the positive properties of each can be retained while mitigating their potential weaknesses. Hybrid space-filling designs, generated as Latin hypercubes augmented with I-optimal points, are compared to designs of each contributing component. A second design type called a bridge design is also evaluated, which further integrates the disparate design types. Bridge designs are the result of a Latin hypercube undergoing coordinate exchange to reach constrained D-optimality, ensuring that there is zero replication of factors in any one-dimensional projection. Lastly, bridge designs were augmented with I-optimal points with two goals in mind: augmentation with candidate points generated under the same underlying analysis model reduces the prediction variance without greatly compromising the space-filling property of the design, while augmentation with candidate points generated under a different underlying analysis model can greatly reduce the impact of model misspecification during the design phase. Each of these composite designs is compared to pure space-filling and optimal designs. They typically outperform pure space-filling designs in terms of prediction variance and alphabetic efficiency, while remaining comparable to pure optimal designs at small sample sizes. This makes them excellent candidates for initial experimentation.
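The following sketch illustrates the hybrid construction in rough form: a Latin hypercube is generated with scipy, then greedily augmented with candidate points chosen to reduce average prediction variance under an assumed full quadratic model (a crude stand-in for true I-optimal augmentation). The model, candidate grid, and point counts are illustrative assumptions, not the dissertation's algorithm.

```python
# Hybrid-design sketch: Latin hypercube base plus greedy variance-reducing
# augmentation points, as a rough proxy for I-optimal augmentation.
import numpy as np
from scipy.stats import qmc

def model_matrix(X):
    """Full quadratic model in two factors: 1, x1, x2, x1*x2, x1^2, x2^2."""
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones(len(X)), x1, x2, x1 * x2, x1**2, x2**2])

base = qmc.LatinHypercube(d=2, seed=1).random(n=10) * 2 - 1   # scale to [-1, 1]
grid = np.array([(a, b) for a in np.linspace(-1, 1, 11)
                          for b in np.linspace(-1, 1, 11)])   # candidate points
eval_pts = model_matrix(grid)                                 # integration proxy

design = base.copy()
for _ in range(4):                       # add four augmentation points
    best, best_iv = None, np.inf
    for c in grid:
        Xc = model_matrix(np.vstack([design, c]))
        M = Xc.T @ Xc
        if np.linalg.matrix_rank(M) < M.shape[0]:
            continue
        # Average scaled prediction variance over the grid (I-criterion proxy)
        iv = np.mean(np.sum(eval_pts @ np.linalg.inv(M) * eval_pts, axis=1))
        if iv < best_iv:
            best, best_iv = c, iv
    design = np.vstack([design, best])

print(design.shape)   # 10 space-filling runs + 4 variance-reducing runs
```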
Contributors: Kennedy, Kathryn (Author) / Montgomery, Douglas C. (Thesis advisor) / Johnson, Rachel T. (Thesis advisor) / Fowler, John W. (Committee member) / Borror, Connie M. (Committee member) / Arizona State University (Publisher)
Created: 2013
Description

The U.S. Navy and other amphibious military organizations utilize a derivation of the traditional side stroke called the Combat Side Stroke, or CSS, and tout it as the most efficient technique available. Citing its low aerobic requirements and slow yet powerful movements as superior to the front crawl (freestyle), traditionally considered the best stroke, the Navy treats the CSS as the go-to stroke for any operation in the water. The purpose of this thesis is to apply principles of Industrial Engineering to a real-world situation not typically approached from a perspective of optimization. I will analyze pre-existing data about various swim strokes in order to compare their efficiency across different variables. These variables include calories burned, speed, and strokes per unit distance, as well as their interactions. Calories will be measured by heart rate monitors, converting BPM to calories burned. Speed will be measured by stopwatch and observer. Strokes per unit distance will be measured by observer. The strokes to be analyzed are the breast stroke, crawl stroke, butterfly, and combat side stroke. The goal is to informally test the U.S. Navy's claim that the combat side stroke is the optimal stroke for conserving energy while covering distance. Because of limitations in the scope of the project, analysis will be done using data collected from literature sources rather than through experimentation. This thesis will include a design of experiments to test the findings in a practical study. The main method of analysis will be linear programming, followed by hypothesis testing, culminating in the design of experiments for future work on this topic.
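As a sketch of the linear-programming framing mentioned above, the toy model below chooses what fraction of a fixed distance to swim with each stroke so that total calories are minimized within a time budget. The per-stroke speed and calorie figures are invented placeholders, not the thesis's data.

```python
# Toy LP: minimize calories over a stroke mix, subject to a time budget
# and a total-distance requirement. All numbers are hypothetical.
from scipy.optimize import linprog

strokes = ["breast", "crawl", "butterfly", "combat side"]
cal_per_100m = [60.0, 45.0, 80.0, 40.0]     # hypothetical energy costs
sec_per_100m = [110.0, 75.0, 85.0, 105.0]   # hypothetical paces

distance_units = 5          # total distance: 500 m, in 100 m units
time_budget = 520.0         # seconds allowed

# Decision variables: 100 m units swum with each stroke (>= 0)
res = linprog(
    c=cal_per_100m,                           # minimize total calories
    A_ub=[sec_per_100m], b_ub=[time_budget],  # finish within the time budget
    A_eq=[[1, 1, 1, 1]], b_eq=[distance_units],
    bounds=[(0, None)] * 4,
)
for name, units in zip(strokes, res.x):
    print(f"{name}: {units * 100:.0f} m")
print(f"total calories: {res.fun:.0f}")
```

With these placeholder figures the LP favors the cheapest stroke (combat side) for most of the distance, trading a short stretch of faster crawl to satisfy the time budget; the same structure would apply to the thesis's measured data.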

Contributors: Goodsell, Kevin Lewis (Author) / McCarville, Daniel R. (Thesis director) / Kashiwagi, Jacob (Committee member) / Industrial, Systems (Contributor) / Barrett, The Honors College (Contributor)
Created: 2014-12
Description

Based on findings of previous studies, there was speculation that two well-known experimental design software packages, JMP and Design Expert, produced varying power outputs given the same design and user inputs. For context and scope, another popular experimental design software package, Minitab® Statistical Software version 17, was added to the comparison. The study compared multiple test cases run on the three software packages, focusing on 2^k and 3^k factorial designs while adjusting the standard deviation effect size, the number of categorical factors, levels, factors, and replicates. All six cases were run on all three programs at one, two, and three replicates each, where possible. There was an issue at the one-replicate stage, however: Minitab does not allow single-replicate full factorial designs, and Design Expert will not provide power outputs for a single replicate unless there are three or more factors. From the analysis of these results, it was concluded that the differences between JMP 13 and Design Expert 10 were well within the margin of error and likely caused by rounding. The differences between JMP 13, Design Expert 10, and Minitab 17, on the other hand, indicated a fundamental difference in the way Minitab performs power calculations compared to the latest versions of JMP and Design Expert. This is likely because Minitab uses dummy-variable coding by default rather than the orthogonal coding that the other two packages use by default. Although dummy-variable and orthogonal coding of factorial designs do not produce different model-fitting results, the two methods affect the power calculations. All three programs can be adjusted to use either method of coding, but the exact instructions for doing so are difficult to find, so a follow-up guide on changing the coding of factorial variables would address this issue.
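A small numeric illustration of the coding difference discussed above: the same replicated 2^2 factorial expressed in dummy (0/1) and in orthogonal (-1/+1) coding produces different information matrices X'X, which is one way power calculations that assume different codings can disagree. The design and replicate count here are illustrative, not the thesis's test cases.

```python
# Compare the information matrix X'X of a replicated 2^2 factorial under
# dummy (0/1) vs. orthogonal (-1/+1) coding of the factors.
import numpy as np

levels = [(a, b) for a in (0, 1) for b in (0, 1)]   # 2^2 full factorial
reps = 2
runs = np.array(levels * reps, dtype=float)         # replicated design

def xtx(F):
    """Information matrix for the model 1, x1, x2, x1*x2."""
    X = np.column_stack([np.ones(len(F)), F[:, 0], F[:, 1], F[:, 0] * F[:, 1]])
    return X.T @ X

dummy = runs              # 0/1 dummy coding
orthog = 2 * runs - 1     # -1/+1 orthogonal coding

print("dummy coding X'X:\n", xtx(dummy))
print("orthogonal coding X'X:\n", xtx(orthog))
# Orthogonal coding yields a diagonal X'X (uncorrelated effect estimates);
# dummy coding does not, so the effect-size-to-variance ratio implied by a
# "power for an effect of size d" request depends on the assumed coding.
```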

Contributors: Armstrong, Julia Robin (Author) / McCarville, Daniel R. (Thesis director) / Montgomery, Douglas (Committee member) / Industrial, Systems (Contributor) / Barrett, The Honors College (Contributor)
Created: 2017-05
Description
In 2010, for the first time in human history, more than half of the world's total population lived in cities; this number is expected to increase to 60% or more by 2050. The goal of this research effort is to create a comprehensive model and modelling framework for megacities, middleweight cities, and urban agglomerations, collectively referred to as dense urban areas. The motivation for this project comes from the United States Army's desire for readiness in all operating environments, including dense urban areas. Though there is valuable insight in research to support Army operational behaviors, megacities are of unique interest to nearly every societal sector imaginable. Design of Experiments offers a novel application for determining both main effects and interaction effects between factors within a dense urban area, providing insight into factor causation. Regression modelling can also be employed for the analysis of dense urban areas, providing wide-ranging insights into correlations between factors and their interactions. Past studies involving megacities concern themselves with the general trends of cities and their operation. This study is unique in its effort to model a single megacity to enable decision support for military operational planning, as well as potential decision support for city planners seeking to increase the sustainability of these dense urban areas and megacities.
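As a hedged sketch of the regression-with-interactions analysis mentioned above, the snippet below fits a linear model with an interaction term to synthetic stand-in data. The factor names and the generating model are invented for illustration and are not part of the study.

```python
# Fit main effects plus an interaction term to synthetic urban-factor data.
import numpy as np

rng = np.random.default_rng(2)
n = 200
density = rng.uniform(-1, 1, n)     # coded stand-in factor 1
infra = rng.uniform(-1, 1, n)       # coded stand-in factor 2
# Synthetic response with a genuine interaction between the two factors
y = (3.0 + 1.5 * density - 0.8 * infra + 2.0 * density * infra
     + rng.normal(0.0, 0.5, n))

# Least-squares fit of intercept, main effects, and the interaction
X = np.column_stack([np.ones(n), density, infra, density * infra])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
for name, b in zip(["intercept", "density", "infra", "density:infra"], coef):
    print(f"{name:>14}: {b:+.2f}")
# A DOE-style analysis would flag the density:infra coefficient as an
# interactive effect between the factors, as discussed above.
```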
Contributors: Mathesen, Logan Michael (Author) / Zenzen, Frances (Thesis director) / Jennings, Cheryl (Committee member) / Industrial, Systems (Contributor) / Barrett, The Honors College (Contributor)
Created: 2016-05
Description
Recent studies in traumatic brain injury (TBI) have found a temporal window in which therapeutics on the nanometer scale can cross the blood-brain barrier and enter the parenchyma. Developing protein-based therapeutics is attractive for a number of reasons, yet the production pipeline for high-yield, consistently bioactive recombinant proteins remains a major obstacle. Previous studies of recombinant protein production have utilized gram-negative hosts such as Escherichia coli (E. coli) due to their well-established genetics and fast growth. However, gram-negative hosts require lysis, which calls for additional optimization and also introduces endotoxins and proteases that contribute to protein degradation. This project directly addressed this issue and evaluated the potential of a gram-positive host, Brevibacillus choshinensis (Brevi), which does not require lysis because the proteins are expressed directly into the supernatant. This host was used to produce variants of the Stock 11 (S11) protein as a proof of concept for this methodology. Variants of S11 were synthesized using different restriction enzymes, altering the location of protein tags in ways that may affect production or purification. Factors such as incubation time, incubation temperature, and media were optimized for each variant of S11 using a robust design of experiments. All variants of S11 were grown using the optimized parameters prior to purification via affinity chromatography. The results demonstrated the efficiency of Brevi as a potential host for domain antibody production in the Stabenfeldt lab. Future aims will focus on troubleshooting the purification process to optimize the protein production pipeline.
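The snippet below sketches the kind of designed experiment described above: a small full factorial over incubation time, temperature, and medium, with each run's protein yield as the response. The factor levels and media labels are hypothetical placeholders, not the study's actual settings.

```python
# Enumerate a small full-factorial design over three culture factors.
from itertools import product

# Hypothetical factor levels (placeholders, not the study's values)
times_h = [24, 48, 72]          # incubation times in hours
temps_c = [25, 30, 37]          # incubation temperatures in deg C
media = ["A", "B"]              # two candidate growth media

design = list(product(times_h, temps_c, media))
print(f"{len(design)} runs")    # 3 x 3 x 2 = 18 factorial runs
for run_id, (t, temp, m) in enumerate(design, start=1):
    print(f"run {run_id:2d}: {t:2d} h, {temp} C, medium {m}")
# Each run would be grown under its settings and the S11 yield measured;
# fitting a regression or ANOVA to the responses then identifies which
# factors (and interactions) drive production.
```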
Contributors: Embrador, Glenna Bea Rebano (Author) / Stabenfeldt, Sarah (Thesis director) / Plaisier, Christopher (Committee member) / Harrington Bioengineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2019-05