Description
Random Forests is a statistical learning method that has been proposed for propensity score estimation models involving complex interactions among the covariates, nonlinear relationships, or both. In this dissertation I conducted a simulation study to examine the effects of three Random Forests model specifications in propensity score analysis. The results suggested that, depending on the nature of the data, optimal specification of (1) the decision rule for selecting the covariate and its split value in a Classification Tree, (2) the number of covariates randomly sampled for selection, and (3) the method of estimating Random Forests propensity scores could potentially produce an unbiased average treatment effect estimate after propensity score weighting by the odds adjustment. Compared to the logistic regression estimation model using the true propensity score model, Random Forests had the additional advantage of producing an unbiased estimated standard error and correct statistical inference for the average treatment effect. The relationship between balance on the covariates' means and the bias of the average treatment effect estimate was examined both within and between conditions of the simulation. Within conditions, across repeated samples there was no noticeable correlation between the covariates' mean differences and the magnitude of bias of the average treatment effect estimate for covariates that were imbalanced before adjustment. Between conditions, small mean differences of covariates after propensity score adjustment were not sensitive enough to identify the optimal Random Forests model specification for propensity score analysis.
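As a minimal illustration of the weighting-by-the-odds adjustment described above (not the dissertation's simulation design), the sketch below uses the true propensity score in place of a Random Forests estimate; the data-generating model and all names are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: one confounder x drives both treatment and outcome.
n = 5000
x = rng.normal(size=n)
e = 1 / (1 + np.exp(-x))                 # true propensity score
z = rng.binomial(1, e)                   # treatment indicator
y = 2.0 * z + x + rng.normal(size=n)     # true treatment effect = 2.0

# Weighting by the odds: treated units get weight 1, control units get
# e / (1 - e), targeting the average treatment effect on the treated.
w = np.where(z == 1, 1.0, e / (1 - e))

att = y[z == 1].mean() - np.average(y[z == 0], weights=w[z == 0])
```

In a Random Forests analysis, `e` would instead come from the forest's estimated class probabilities for treatment.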
Contributors: Cham, Hei Ning (Author) / Tein, Jenn-Yun (Thesis advisor) / Enders, Stephen G (Thesis advisor) / Enders, Craig K. (Committee member) / Mackinnon, David P (Committee member) / Arizona State University (Publisher)
Created: 2013
Description
Accelerated life testing (ALT) is the process of subjecting a product to stress conditions (temperature, voltage, pressure, etc.) in excess of its normal operating levels to accelerate failures. Product failure typically results from multiple stresses acting on it simultaneously. Multi-stress-factor ALTs are challenging because the stress factor-level combinations arising from the increased number of factors greatly increase the number of experiments. Chapter 2 provides an approach for designing ALT plans with multiple stresses utilizing Latin hypercube designs that reduces the simulation cost without loss of statistical efficiency. A comparison to full-grid and large-sample approximation methods illustrates the approach's gain in computational cost and its flexibility in determining optimal stress settings with fewer assumptions and more intuitive unit allocations.
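The Latin hypercube construction the chapter builds on can be sketched as follows; the stress factors, ranges, and run count here are illustrative, not those of the actual test plans:

```python
import numpy as np
from scipy.stats import qmc

# Two hypothetical stress factors: temperature (deg C) and voltage (V).
sampler = qmc.LatinHypercube(d=2, seed=0)
unit = sampler.random(n=10)    # 10 design points in the unit square;
                               # each factor gets one point per 1/10 stratum
plan = qmc.scale(unit, l_bounds=[40, 5], u_bounds=[120, 15])
```

The stratification property is what keeps every one-dimensional margin evenly covered with far fewer runs than a full factorial grid.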

Implicit in the design criteria of current ALT designs is the assumption that the form of the acceleration model is correct. This is an unrealistic assumption in many real-world problems. Chapter 3 provides an approach for ALT optimum design for model discrimination. We utilize the Hellinger distance measure between predictive distributions. The optimal ALT plan at three stress levels was determined and its performance was compared to a good compromise plan, the best traditional plan, and the well-known 4:2:1 compromise test plan. In the case of linear versus quadratic ALT models, the proposed method increased the test plan's ability to distinguish among competing models and provided better guidance as to which model is appropriate for the experiment.
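For discrete predictive distributions, the Hellinger distance used as the discrimination criterion has a simple closed form; this is a generic sketch, not the chapter's exact implementation over ALT predictive distributions:

```python
import numpy as np

def hellinger(p, q):
    """Hellinger distance between two discrete distributions (0 <= H <= 1).
    H = 0 when the distributions coincide; H = 1 when they have
    disjoint support, i.e., the models are maximally distinguishable."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return np.sqrt(0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2))
```

A design that pushes the Hellinger distance between the competing models' predictive distributions toward 1 makes the models easier to tell apart from the observed failures.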

Chapter 4 extends the approach of Chapter 3 to sequential ALT model discrimination. An initial experiment is conducted to provide the maximum possible information with respect to model discrimination. The follow-on experiment is planned by leveraging the most current information to allow for Bayesian model comparison through posterior model probability ratios. Results showed that the performance of the plan is adversely impacted by the amount of censoring in the data. In the case of a linear vs. quadratic model form at three levels of constant stress, sequential testing can improve the model recovery rate by approximately 8% when the data are complete, but no apparent advantage in adopting sequential testing was found for right-censored data once censoring exceeds a certain amount.
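The posterior model probability ratio driving the sequential comparison reduces to the Bayes factor times the prior odds; a minimal sketch, assuming the log marginal likelihoods of the two candidate models are available:

```python
import numpy as np

def posterior_odds(log_ml_1, log_ml_2, prior_odds=1.0):
    """Posterior model probability ratio P(M1 | data) / P(M2 | data):
    the Bayes factor exp(log_ml_1 - log_ml_2) times the prior odds.
    Working on the log scale avoids underflow for small likelihoods."""
    return np.exp(log_ml_1 - log_ml_2) * prior_odds
```

After the initial experiment, these odds become the prior odds for planning the follow-on experiment.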
Contributors: Nasir, Ehab (Author) / Pan, Rong (Thesis advisor) / Runger, George C. (Committee member) / Gel, Esma (Committee member) / Kao, Ming-Hung (Committee member) / Montgomery, Douglas C. (Committee member) / Arizona State University (Publisher)
Created: 2014
Description
No-confounding (NC) designs in 16 runs for 6, 7, and 8 factors are non-regular fractional factorial designs that have been suggested as attractive alternatives to the regular minimum aberration resolution IV designs because they do not completely confound any two-factor interactions with each other. These designs allow for potential estimation of main effects and a few two-factor interactions without the need for follow-up experimentation. Analysis methods for non-regular designs are an area of ongoing research, because standard variable selection techniques such as stepwise regression may not always be the best approach. The current work investigates the use of the Dantzig selector for analyzing no-confounding designs. Through a series of examples it shows that this technique is very effective for identifying the set of active factors in no-confounding designs when there are three or four active main effects and up to two active two-factor interactions.

To evaluate the performance of the Dantzig selector, a simulation study was conducted and the results, based on the percentage of type II errors, are analyzed. In addition, an alternative 6-factor NC design, called the Alternate No-confounding design in six factors, is introduced in this study. The performance of this Alternate NC design is then evaluated using the Dantzig selector as the analysis method. Lastly, a section is dedicated to comparing the performance of the NC-6 and Alternate NC-6 designs.
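The Dantzig selector itself can be posed as a linear program: minimize the L1 norm of the coefficients subject to a bound on the maximum correlation of the residuals with the columns of the design matrix. A generic sketch, not tied to the 16-run designs studied here:

```python
import numpy as np
from scipy.optimize import linprog

def dantzig_selector(X, y, delta):
    """Dantzig selector: min ||b||_1  s.t.  ||X'(y - Xb)||_inf <= delta.
    Solved as an LP by splitting b = u - v with u, v >= 0."""
    n, p = X.shape
    G, c = X.T @ X, X.T @ y
    # |c - G(u - v)| <= delta, written as two banks of inequalities.
    A_ub = np.block([[G, -G], [-G, G]])
    b_ub = np.concatenate([delta + c, delta - c])
    res = linprog(np.ones(2 * p), A_ub=A_ub, b_ub=b_ub, bounds=(0, None))
    uv = res.x
    return uv[:p] - uv[p:]
```

Coefficients that the LP drives to zero correspond to inactive factors; a small worked case with an orthogonal design matrix reduces to soft-thresholding.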
Contributors: Krishnamoorthy, Archana (Author) / Montgomery, Douglas C. (Thesis advisor) / Borror, Connie (Thesis advisor) / Pan, Rong (Committee member) / Arizona State University (Publisher)
Created: 2014
Description
The intention of this report is to use computer simulations to investigate the viability of two materials, water and polyethylene, as shielding against space radiation. First, this thesis discusses some of the challenges facing current and future manned space missions as a result of galactic cosmic radiation, or GCR. The project then uses MULASSIS, a Geant4-based radiation simulation tool, to analyze the effectiveness of water- and polyethylene-based radiation shields against proton radiation with an initial energy of 1 GeV. This specific spectrum of radiation is selected because it is a component of GCR that previous literature has shown to pose a significant threat to humans on board spacecraft. The analysis of each material indicated that both would have to be several meters thick to adequately protect crew against the simulated radiation over a multi-year mission. Additionally, an analysis of the mass of a simple spacecraft model with different shield thicknesses showed that the mass would increase significantly with internal space. Thus, using either material as a shield would be expensive as a result of the cost of lifting a large amount of mass into orbit.
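The mass growth with internal space follows from simple geometry: for a spherical habitat, shield mass scales with the difference of cubes of the outer and inner radii. A toy calculation under assumed dimensions, not the thesis's actual spacecraft model:

```python
import math

def shell_mass(inner_radius_m, thickness_m, density_kg_m3):
    """Mass of a uniform spherical shield shell around a habitat."""
    r1 = inner_radius_m
    r2 = inner_radius_m + thickness_m
    return density_kg_m3 * (4.0 / 3.0) * math.pi * (r2**3 - r1**3)

# Hypothetical example: a 1 m water shield around a 2 m-radius habitat
# already weighs on the order of 80 metric tons.
water_shield = shell_mass(2.0, 1.0, 1000.0)
```

Doubling the habitat radius at the same shield thickness multiplies the shield mass several times over, which is why launch cost dominates the trade study.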
Contributors: Bonfield, Maclain Peter (Author) / Holbert, Keith (Thesis director) / Young, Patrick (Committee member) / Mechanical and Aerospace Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2019-05
Description
Information technology has become an increasingly popular major offered by various universities, and it provides students with notable flexibility regarding courses, as there are multiple career paths to suit a person's specific technical interests. However, the methods universities use to promote their majors, including IT, are not as effective as they need to be. The mediums currently used are mostly comprised of brochures, flyers, a single web page on the university's larger website, and some communication with advisors, among others. This can be an issue for many readers, as the information is often brief, providing only summaries of what can be expected and ambiguous statistics that may not accurately or completely reflect the prospects of graduates looking to make a living for the rest of their adult lives. This could cause some students to choose a major that may not be their best fit, and changing majors later is very costly and can delay graduation by one or more years. Therefore, the promotion of majors will have to be rethought. The IT major offered at ASU is a perfect opportunity to determine whether a major can be promoted through a different approach with a senior capstone project involving a website. This Barrett creative project acts as a subset of the capstone project, exploring the introduction of an interactive element to the website that gives prospective students a brief introduction to the computer networking aspect of IT, including a real-world introductory game along with more detailed exercises if the user is interested.
Contributors: Schultz, Dillon Maxwell (Author) / Doheny, Damien (Thesis director) / Balyan, Renu (Committee member) / Information Technology (Contributor) / Barrett, The Honors College (Contributor)
Created: 2019-05
Description

To understand the role communication and effective management play in the project management field, virtual work was analyzed in two phases. Phase one consisted of gaining familiarity with the field of project management by interviewing three project managers, who discussed their field of work, how it has changed due to Covid-19, approaches to communication and virtual team management, and strategies that allow for effective project management. Phase two comprised a simulation in which eight ASU student volunteers were put into scenarios that required completing and executing a given project. Students gained project experience through the simulation and had an opportunity to reflect on it.

Contributors: Sandhu, Shiwani K (Author) / Kassing, Jeff (Thesis director) / Pandya, Bankim (Committee member) / College of Integrative Sciences and Arts (Contributor) / School of Social and Behavioral Sciences (Contributor) / Thunderbird School of Global Management (Contributor) / Barrett, The Honors College (Contributor)
Created: 2021-05
Description

Currently, autonomous vehicles are being evaluated by how well they interact with humans without evaluating how well humans interact with them. Since people are not going to unanimously switch over to using autonomous vehicles, attention must be given to how well these new vehicles signal intent to human drivers from the driver's point of view. Ineffective communication will lead to unnecessary discomfort among drivers caused by an underlying uncertainty about what an autonomous vehicle is or isn't about to do. Recent studies suggest that humans tend to fixate on areas of higher uncertainty, so scenarios with a higher number of vehicle fixations can be reasoned to be more uncertain. We provide a framework for measuring human uncertainty and use it to measure the effect of empathetic vs. non-empathetic agents. We used a simulated driving environment to create recorded scenarios and manipulated the autonomous vehicle to include either an empathetic or non-empathetic agent. The driving interaction is composed of two vehicles approaching an uncontrolled intersection. These scenarios were played to twelve participants while their gaze was recorded to track what they were fixating on. The overall intent was to provide an analytical framework as a tool for evaluating autonomous driving features; in this case, we chose to evaluate how effective it was for vehicles to include empathetic behaviors in their decision making. A t-test analysis of the gaze data indicated that empathy did not in fact reduce uncertainty, although additional testing of this hypothesis will be needed due to the small sample size.
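The kind of t-test analysis described can be sketched as a two-sample Welch test on per-participant fixation counts; the numbers below are made up for illustration and are not the study's data:

```python
import numpy as np
from scipy import stats

# Hypothetical fixation counts on the autonomous vehicle, one value per
# participant, for each agent condition.
empathetic     = np.array([14, 12, 15, 11, 13, 12])
non_empathetic = np.array([13, 15, 12, 14, 11, 14])

# Welch's t-test: does mean fixation count (the proxy for uncertainty)
# differ between the two conditions?
t_stat, p_value = stats.ttest_ind(empathetic, non_empathetic, equal_var=False)
```

A p-value above the chosen significance level, as in the study's finding, means the data do not show that the empathetic agent reduced fixations.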

Contributors: Greenhagen, Tanner Patrick (Author) / Yang, Yezhou (Thesis director) / Jammula, Varun C (Committee member) / Computer Science and Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2021-05
Description

A novel CFD algorithm called LEAP is currently being developed by the Kasbaoui Research Group (KRG) using the Immersed Boundary Method (IBM) to describe complex geometries. To validate the algorithm, this research project focused on testing the algorithm in three dimensions by simulating a sphere placed in a moving fluid. The simulation results were compared against the experimentally derived Schiller-Naumann Correlation. Over the course of 36 trials, various spatial and temporal resolutions were tested at specific Reynolds numbers between 10 and 300. It was observed that numerical errors decreased with increasing spatial and temporal resolution. This result was expected as increased resolution should give results closer to experimental values. Having shown the accuracy and robustness of this method, KRG will continue to develop this algorithm to explore more complex geometries such as aircraft engines or human lungs.
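For reference, the Schiller-Naumann correlation the simulations were validated against has a simple closed form; the sketch below is the standard textbook expression, valid roughly for Reynolds numbers up to about 800:

```python
def schiller_naumann_cd(re):
    """Schiller-Naumann drag coefficient for a sphere in uniform flow:
    Cd = (24 / Re) * (1 + 0.15 * Re**0.687), an empirical correction
    to Stokes drag, valid roughly for Re < ~800."""
    return 24.0 / re * (1.0 + 0.15 * re ** 0.687)
```

At low Reynolds number this recovers the Stokes limit Cd ≈ 24/Re, so comparing the simulated drag on the immersed sphere against this curve across Re = 10 to 300 is a standard validation check.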

Contributors: Madden, David Jackson (Author) / Kasbaoui, Mohamed Houssem (Thesis director) / Herrmann, Marcus (Committee member) / Mechanical and Aerospace Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2021-05
Description

High-entropy alloys possessing mechanical, chemical, and electrical properties that far exceed those of conventional alloys have the potential to make a significant impact on many areas of engineering. Identifying element combinations and configurations to form these alloys, however, is a difficult, time-consuming, computationally intensive task. Machine learning has revolutionized many different fields due to its ability to generalize well to different problems and produce computationally efficient, accurate predictions regarding the system of interest. In this thesis, we demonstrate the effectiveness of machine learning models applied to toy cases representative of simplified physics that are relevant to high-entropy alloy simulation. We show these models are effective at learning nonlinear dynamics for single and multi-particle cases and that more work is needed to accurately represent complex cases in which the system dynamics are chaotic. This thesis serves as a demonstration of the potential benefits of machine learning applied to high-entropy alloy simulations to generate fast, accurate predictions of nonlinear dynamics.

Contributors: Daly, John H (Author) / Ren, Yi (Thesis director) / Zhuang, Houlong (Committee member) / Mechanical and Aerospace Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2021-05
Description

Covid-19 is unlike any coronavirus we have seen before, characterized mostly by the ease with which it spreads. This analysis utilizes an SEIR model, built to accommodate various populations, to understand how different testing and infection rates may affect hospitalization and death. The analysis finds that infection rates have a significant impact on Covid-19 outcomes regardless of the population, whereas the impact of testing rates in this simulation is not as pronounced. Thus, policy-makers should focus on decreasing infection rates through targeted lockdowns and vaccine rollout to contain the virus and decrease its spread.
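A standard SEIR model of the kind described can be integrated with a simple forward-Euler loop; the rates and population below are placeholders, not the calibrated values from this analysis:

```python
def seir(beta, sigma, gamma, s0, e0, i0, r0, days, dt=0.1):
    """Forward-Euler integration of the standard SEIR compartment model.
    beta: infection rate, sigma: incubation rate (1 / latent period),
    gamma: recovery rate (1 / infectious period). Returns final S, E, I, R."""
    s, e, i, r = s0, e0, i0, r0
    n = s + e + i + r  # total population is conserved
    for _ in range(int(days / dt)):
        ds = -beta * s * i / n
        de = beta * s * i / n - sigma * e
        di = sigma * e - gamma * i
        dr = gamma * i
        s, e, i, r = s + ds * dt, e + de * dt, i + di * dt, r + dr * dt
    return s, e, i, r

# Hypothetical run: basic reproduction number beta/gamma = 5, so the
# epidemic sweeps through most of a population of 10,000.
s, e, i, r = seir(beta=0.5, sigma=0.2, gamma=0.1,
                  s0=9990.0, e0=10.0, i0=0.0, r0=0.0, days=200)
```

Lowering `beta` (e.g., via lockdowns) shrinks the final size of the R compartment, which is the mechanism behind the policy conclusion above.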

Created: 2021-05