Matching Items (8)
Description
Random Forests is a statistical learning method that has been proposed for propensity score estimation when the model involves complex interactions, nonlinear relationships, or both among the covariates. In this dissertation I conducted a simulation study to examine the effects of three Random Forests model specifications in propensity score analysis. The results suggested that, depending on the nature of the data, optimal specification of (1) the decision rules used to select the covariate and its split value in a Classification Tree, (2) the number of covariates randomly sampled for selection, and (3) the method of estimating Random Forests propensity scores could potentially produce an unbiased average treatment effect estimate after propensity score weighting by the odds. Compared to the logistic regression estimation model using the true propensity score model, Random Forests had the additional advantage of producing an unbiased estimated standard error and correct statistical inference for the average treatment effect. The relationship between balance on the covariates' means and the bias of the average treatment effect estimate was examined both within and between conditions of the simulation. Within conditions, across repeated samples there was no noticeable correlation between the covariates' mean differences and the magnitude of bias in the average treatment effect estimate for covariates that were imbalanced before adjustment. Between conditions, small mean differences of covariates after propensity score adjustment were not sensitive enough to identify the optimal Random Forests model specification for propensity score analysis.
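To make the weighting-by-the-odds adjustment concrete, here is a minimal Python sketch, not the dissertation's actual code: it generates data with a nonlinear, interactive treatment-assignment model, estimates propensity scores with a random forest, and weights control units by the odds e/(1 - e) to estimate the average treatment effect on the treated. All data-generating values and tuning parameters are illustrative assumptions.

```python
# Minimal sketch (assumptions throughout): RF propensity scores + odds weighting.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n = 2000
X = rng.normal(size=(n, 4))
# True propensity model with an interaction and a nonlinear term
logit = 0.5 * X[:, 0] + 0.7 * X[:, 1] * X[:, 2] + 0.4 * X[:, 3] ** 2 - 0.5
treat = rng.binomial(1, 1 / (1 + np.exp(-logit)))
y = 2.0 * treat + X[:, 0] + rng.normal(size=n)  # true treatment effect = 2.0

# Estimate propensity scores from the forest's class-probability votes
rf = RandomForestClassifier(n_estimators=500, max_features=2, random_state=0)
ps = rf.fit(X, treat).predict_proba(X)[:, 1].clip(0.01, 0.99)

# Weighting by the odds: treated units get weight 1, controls get e / (1 - e)
w = np.where(treat == 1, 1.0, ps / (1 - ps))
att = y[treat == 1].mean() - np.average(y[treat == 0], weights=w[treat == 0])
print(f"odds-weighted ATT estimate: {att:.3f}")
```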
Contributors: Cham, Hei Ning (Author) / Tein, Jenn-Yun (Thesis advisor) / Enders, Stephen G (Thesis advisor) / Enders, Craig K. (Committee member) / Mackinnon, David P (Committee member) / Arizona State University (Publisher)
Created: 2013
Description
The Migration Framework and Simulator is a combination of a C# framework/library and a Unity simulation tool used for studying basic migration patterns across the US. Users interact with the Unity simulation tool by implementing political policies or adjusting values via sliders, buttons, etc., which alters the values in the framework. The user can then use the simulation interface to view different estimated population values for categories of people, such as regional differences, education levels, and more.
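The abstract does not include the framework's code, but the slider-to-framework flow it describes can be sketched in a few lines of Python (the actual project is C#/Unity). The regions, baseline rate, and slider values below are invented purely for illustration.

```python
# Toy sketch of policy sliders scaling region-to-region migration flows.
# All regions, rates, and slider values are hypothetical.
REGIONS = ["Northeast", "Midwest", "South", "West"]
population = {"Northeast": 57.0, "Midwest": 68.0, "South": 126.0, "West": 78.0}  # millions
base_rate = 0.01                      # baseline fraction moving per region pair per year
slider = {"Northeast": 0.8, "Midwest": 0.9, "South": 1.5, "West": 1.2}  # destination appeal

def step(pop):
    """Advance one simulated year: net flows from every region to every other."""
    net = {r: 0.0 for r in REGIONS}
    for src in REGIONS:
        for dst in REGIONS:
            if src != dst:
                moved = pop[src] * base_rate * slider[dst]
                net[src] -= moved
                net[dst] += moved
    return {r: pop[r] + net[r] for r in REGIONS}

for _ in range(5):                    # five years under the chosen policy settings
    population = step(population)
print({r: round(v, 1) for r, v in population.items()})
```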
Contributors: Larsen, Joseph (Co-author) / Spangler, Braydon (Co-author) / Kobayashi, Yoshihiro (Thesis director) / Nelson, Brian (Committee member) / Computing and Informatics Program (Contributor) / Computer Science and Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2020-05
Description
The instruction of students in computer science concepts can be enhanced by creating programmable simulations and games. ASU VIPLE, a framework used to control simulations, robots, and IoT applications, can be used as an educational tool. Further, the Unity engine allows the creation of 2D and 3D games. The development of basic minigames in Unity can provide simulations for students to program. One can run the Unity minigames and the corresponding VIPLE scripts that control them over a network connection as well as locally. The minigames conform to the robot output and robot input interfaces supported by VIPLE. With this goal in mind, a snake game, a space shooter game, and a runner game have been created as Unity simulations that can be controlled by scripts made using VIPLE. These games represent simulated environments that, with movement output and sensor input, students can program simply and externally from VIPLE to help them learn robotics and computer science principles.
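As a hedged illustration of what running a minigame "over a network connection" can look like, the sketch below stands in for the simulation side: it listens on a TCP socket, receives movement commands, and replies with simulated sensor readings. The port and JSON message shapes are assumptions for illustration, not VIPLE's documented robot input/output interface.

```python
# Hypothetical simulation-side listener; message formats are assumed, not VIPLE's.
import json
import socket

HOST, PORT = "127.0.0.1", 8000  # assumed address that the controlling script targets

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as server:
    server.bind((HOST, PORT))
    server.listen(1)
    conn, _ = server.accept()
    with conn:
        while True:
            data = conn.recv(4096)
            if not data:
                break
            command = json.loads(data.decode())          # e.g. a movement command
            print("movement output received:", command)
            sensor = {"name": "distance", "value": 42}   # simulated sensor input
            conn.sendall((json.dumps(sensor) + "\n").encode())
```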
Contributors: Christensen, Collin Riley (Author) / Chen, Yinong (Thesis director) / Kobayashi, Yoshihiro (Committee member) / Computer Science and Engineering Program (Contributor) / Computing and Informatics Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2019-05
Description

NASA has partnered with multiple colleges, including ASU, on a mission to study an asteroid called Psyche. Psyche is the first asteroid discovered to be made of metal, mostly iron, that is close enough for us to study, and it could give insight into what Earth's core is like. The mission plans and research documents on how the various measurement tools work are not engaging to those without a background in STEM. This served as the inspiration to make a web-based game that presents the information in a more engaging way. The game takes the user through the Psyche mission, from the assembly of the measurement tools all the way to when the satellite is orbiting the asteroid. The creative project consisted of creating a simulation for a young audience, between ages 10 and 18, to experience what the mission could look like once the satellite is at the Psyche asteroid and what the data collected could mean. The asteroid could have been formed through a process called the dynamo process, or it could be a piece of a larger parent body. It could be made mostly of metal or of silicates, which will be determined during the mission. These are some of the results that will be generalized and relayed to the player. This creative project includes the four main sections of the orbit phase of the mission, in which the users perform tasks to collect data and see some of the generalized possible results of the study of Psyche. Some of the data collected include the amount of metal making up the asteroid and its gravitational pull. The first main section uses the magnetometer, the second section uses the multispectral imager, the third section uses X-band radio waves, and the fourth section uses the gamma-ray and neutron spectrometer.

Contributors: Ogar, Scott (Author) / Carter, Lynn (Thesis director) / Chavez-Echeagaray, Maria Elena (Committee member) / Barrett, The Honors College (Contributor) / Computing and Informatics Program (Contributor) / Computer Science and Engineering Program (Contributor)
Created: 2023-05
Description

The process of learning a new skill can be time consuming and difficult for both the teacher and the student, especially when it comes to computer modeling. With so many terms and functionalities to familiarize oneself with, this task can be overwhelming to even the most knowledgeable student. The purpose of this paper is to describe the methodology used in the creation of a new set of curricula for those attempting to learn how to use the Dynamic Traffic Simulation Package with Multi-Resolution Modeling (DLSim). The current DLSim curriculum relates information via high-concept terms and complicated graphics. This paper aims to provide a streamlined set of curricula for new users of DLSim, including lesson plans and improved infographics.

Contributors: Mills, Alexander (Author) / Zhou, Xuesong (Thesis director) / Chen, Yinong (Committee member) / Barrett, The Honors College (Contributor) / Computing and Informatics Program (Contributor) / Computer Science and Engineering Program (Contributor)
Created: 2022-05
Description
The Partition of Variance (POV) method is a simple way to identify large sources of variation in manufacturing systems. The method identifies the variance sources by estimating the variance of the means (between variance) and the means of the variances (within variance). The project shows that the method correctly identifies the variance source when compared to the ANOVA method. Although the variance estimators deteriorate when varying degrees of non-normality are introduced through simulation, the POV method is shown to be a more stable measure of variance in the aggregate. The POV method also provides non-negative, stable estimates for interaction when compared to the ANOVA method, and it is shown to be more stable particularly in low-sample-size situations. Based on these findings, it is suggested that POV is not a replacement for more complex analysis methods but rather a supplement to them. POV is ideal for preliminary analysis due to its ease of implementation, its simplicity of interpretation, and its lack of dependency on statistical analysis packages or statistical knowledge.
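A small worked example helps fix the two estimators: the "between" component is the variance of the subgroup means and the "within" component is the mean of the subgroup variances. The sketch below uses simulated data with made-up group structure and parameters; note that the variance of the sample means also absorbs within-group noise of order sigma_within^2 / n.

```python
# Worked POV sketch on simulated data; group counts and parameters are illustrative.
import numpy as np

rng = np.random.default_rng(1)
n_groups, n_per = 8, 25
group_means = rng.normal(10.0, 2.0, size=n_groups)            # e.g. lot-to-lot shifts
data = group_means[:, None] + rng.normal(0.0, 1.0, size=(n_groups, n_per))

between = data.mean(axis=1).var(ddof=1)    # variance of the means
within = data.var(axis=1, ddof=1).mean()   # mean of the variances

print(f"between-group variance estimate: {between:.2f}")  # near 2.0**2 + 1.0/25
print(f"within-group variance estimate:  {within:.2f}")   # near 1.0**2
```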
Contributors: Little, David John (Author) / Borror, Connie (Thesis advisor) / Montgomery, Douglas C. (Committee member) / Broatch, Jennifer (Committee member) / Arizona State University (Publisher)
Created: 2015
Description
Accelerated life testing (ALT) is the process of subjecting a product to stress conditions (temperature, voltage, pressure, etc.) in excess of its normal operating levels in order to accelerate failures. Product failure typically results from multiple stresses acting on the product simultaneously. Multi-stress-factor ALTs are challenging because the number of stress factor-level combinations, and hence the number of experiments, grows rapidly with the number of factors. Chapter 2 provides an approach for designing ALT plans with multiple stresses utilizing Latin hypercube designs, which reduces the simulation cost without loss of statistical efficiency. A comparison to full-grid and large-sample approximation methods illustrates the approach's computational cost gain and its flexibility in determining optimal stress settings with fewer assumptions and more intuitive unit allocations.
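As a rough sketch of the Latin hypercube ingredient (not the chapter's actual optimization), the code below spreads candidate multi-stress test conditions over the design space and rescales them to engineering units; the three stress factors and their ranges are invented for illustration.

```python
# Latin hypercube of candidate stress settings; factors and ranges are assumed.
from scipy.stats import qmc

sampler = qmc.LatinHypercube(d=3, seed=0)   # temperature, voltage, pressure
unit_points = sampler.random(n=20)          # 20 candidate stress-level combinations

lower = [60.0, 5.0, 100.0]                  # just above normal operating levels
upper = [120.0, 15.0, 400.0]                # maximum allowable stresses
design = qmc.scale(unit_points, lower, upper)

for temp, volt, pres in design[:5]:
    print(f"T={temp:6.1f} C  V={volt:5.2f} V  P={pres:6.1f} kPa")
```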

Implicit in the design criteria of current ALT designs is the assumption that the form of the acceleration model is correct. This is an unrealistic assumption in many real-world problems. Chapter 3 provides an approach to optimum ALT design for model discrimination, utilizing the Hellinger distance between predictive distributions. The optimal ALT plan at three stress levels was determined and its performance compared to a good compromise plan, the best traditional plan, and the well-known 4:2:1 compromise test plan. In the case of linear versus quadratic ALT models, the proposed method increased the test plan's ability to distinguish among competing models and provided better guidance as to which model is appropriate for the experiment.
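For two normal predictive distributions the Hellinger distance has a closed form, which makes the discrimination criterion easy to illustrate. The sketch below is only that: an illustration with invented predictive means and standard deviations, not the thesis's actual design computation.

```python
# Hellinger distance between two Gaussian predictive distributions (closed form).
import math

def hellinger_normal(mu1, s1, mu2, s2):
    """H between N(mu1, s1^2) and N(mu2, s2^2); 0 = identical, 1 = disjoint."""
    h2 = 1.0 - math.sqrt(2.0 * s1 * s2 / (s1**2 + s2**2)) * math.exp(
        -((mu1 - mu2) ** 2) / (4.0 * (s1**2 + s2**2))
    )
    return math.sqrt(h2)

# Hypothetical predicted log-lifetimes at use stress under two competing models
print(hellinger_normal(mu1=7.2, s1=0.60, mu2=6.8, s2=0.75))  # ~0.23
```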

Chapter 4 extends the approach of Chapter 3 to sequential model discrimination in ALT. An initial experiment is conducted to provide the maximum possible information with respect to model discrimination, and the follow-on experiment is planned by leveraging the most current information to allow for Bayesian model comparison through posterior model probability ratios. Results showed that the performance of the plan is adversely impacted by the amount of censoring in the data. In the case of a linear versus quadratic model form at three levels of constant stress, sequential testing can improve the model recovery rate by approximately 8% when data are complete, but no apparent advantage in adopting sequential testing was found for right-censored data once censoring exceeds a certain amount.
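One hedged way to sketch the posterior-model-probability machinery is with the BIC approximation to each model's marginal likelihood (the thesis works with the full Bayesian quantities; this is only a stand-in). The data, stress levels, and coefficients below are illustrative.

```python
# Linear vs. quadratic acceleration model compared via BIC-approximate
# posterior odds; all numbers are illustrative stand-ins.
import numpy as np

rng = np.random.default_rng(2)
stress = np.repeat([1.0, 2.0, 3.0], 20)      # three constant stress levels
log_life = 8.0 - 1.2 * stress - 0.3 * stress**2 + rng.normal(0, 0.4, stress.size)

def bic(X, y):
    _, rss, *_ = np.linalg.lstsq(X, y, rcond=None)
    n, k = X.shape
    return n * np.log(rss[0] / n) + k * np.log(n)

X_lin = np.column_stack([np.ones_like(stress), stress])
X_quad = np.column_stack([X_lin, stress**2])

# exp(-BIC/2) approximates the marginal likelihood; equal prior odds assumed
odds = np.exp(-0.5 * (bic(X_quad, log_life) - bic(X_lin, log_life)))
print(f"approximate posterior odds, quadratic vs. linear: {odds:.1f}")
```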
Contributors: Nasir, Ehab (Author) / Pan, Rong (Thesis advisor) / Runger, George C. (Committee member) / Gel, Esma (Committee member) / Kao, Ming-Hung (Committee member) / Montgomery, Douglas C. (Committee member) / Arizona State University (Publisher)
Created: 2014
Description
No-confounding (NC) designs in 16 runs for 6, 7, and 8 factors are non-regular fractional factorial designs that have been suggested as attractive alternatives to the regular minimum aberration resolution IV designs because they do not completely confound any two-factor interactions with each other. These designs allow for the potential estimation of main effects and a few two-factor interactions without the need for follow-up experimentation. Analysis methods for non-regular designs are an area of ongoing research, because standard variable selection techniques such as stepwise regression may not always be the best approach. The current work investigates the use of the Dantzig selector for analyzing no-confounding designs. Through a series of examples, it shows that this technique is very effective at identifying the set of active factors in no-confounding designs when there are three or four active main effects and up to two active two-factor interactions.
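The Dantzig selector itself is just a linear program: minimize the l1 norm of the coefficients subject to a bound on the maximum absolute correlation between the residuals and the columns of the model matrix. The hedged sketch below uses a random +/-1 matrix as a stand-in for an actual no-confounding design, and a heuristic bound delta; neither choice comes from the thesis.

```python
# Dantzig selector via linprog: min ||b||_1  s.t.  ||X'(y - Xb)||_inf <= delta.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(3)
n, p = 16, 10                                 # e.g. 16 runs, 10 candidate effects
X = rng.choice([-1.0, 1.0], size=(n, p))      # stand-in for a real NC design matrix
beta_true = np.zeros(p)
beta_true[[0, 3]] = [4.0, -3.0]               # two active effects
y = X @ beta_true + rng.normal(0, 1, n)

delta = np.sqrt(2 * np.log(p) * n)            # heuristic noise-scale bound
G = X.T @ X
# Split beta into nonnegative parts: beta = u - v with u, v >= 0
A = np.block([[G, -G], [-G, G]])
b = np.concatenate([X.T @ y + delta, -(X.T @ y) + delta])
res = linprog(c=np.ones(2 * p), A_ub=A, b_ub=b, bounds=(0, None))
beta_hat = res.x[:p] - res.x[p:]
print(np.round(beta_hat, 2))                  # large entries flag active effects
```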

To evaluate the performance of the Dantzig selector, a simulation study was conducted and the results were analyzed based on the percentage of type II errors. In addition, an alternative six-factor NC design, called the Alternate No-confounding design in six factors (Alternate NC-6), is introduced in this study, and its performance is evaluated using the Dantzig selector as the analysis method. Lastly, a section is dedicated to comparing the performance of the NC-6 and Alternate NC-6 designs.
Contributors: Krishnamoorthy, Archana (Author) / Montgomery, Douglas C. (Thesis advisor) / Borror, Connie (Thesis advisor) / Pan, Rong (Committee member) / Arizona State University (Publisher)
Created: 2014