Matching Items (10)
Description
Accelerated life testing (ALT) is the process of subjecting a product to stress conditions (temperature, voltage, pressure, etc.) in excess of its normal operating levels in order to accelerate failures. Product failure typically results from multiple stresses acting on the product simultaneously. Multi-stress-factor ALTs are challenging because they increase the number of experiments required, owing to the stress factor-level combinations that result from the larger number of factors. Chapter 2 provides an approach for designing ALT plans with multiple stresses that utilizes Latin hypercube designs to reduce the simulation cost without loss of statistical efficiency. A comparison with full-grid and large-sample approximation methods illustrates the approach's computational cost gains and its flexibility in determining optimal stress settings with fewer assumptions and more intuitive unit allocations.
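
As an aside on the mechanics, a Latin hypercube design places exactly one point in each of n equally probable strata per factor, which is why it covers a multi-factor stress space with far fewer runs than a full grid. A minimal sketch (not the dissertation's code; the factor names and ranges are hypothetical):

```python
import random

def latin_hypercube(n, d, seed=0):
    """Generate n points in [0, 1)^d with exactly one point per
    stratum in each dimension (a basic Latin hypercube design)."""
    rng = random.Random(seed)
    points = [[0.0] * d for _ in range(n)]
    for j in range(d):
        # One random draw inside each of the n equal strata, then a
        # random pairing of strata across dimensions via shuffling.
        column = [(i + rng.random()) / n for i in range(n)]
        rng.shuffle(column)
        for i in range(n):
            points[i][j] = column[i]
    return points

# Example: a 10-run design over two stress factors, rescaled to
# hypothetical test ranges (temperature in C, voltage in V).
design = latin_hypercube(10, 2)
runs = [(60 + 60 * t, 10 + 5 * v) for t, v in design]
```

Rescaling the unit-cube columns to real stress ranges yields candidate test conditions, one run per stratum of each factor, which is the source of the cost savings relative to a full factor-level grid.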

Implicit in the design criteria of current ALT designs is the assumption that the form of the acceleration model is correct. This is an unrealistic assumption in many real-world problems. Chapter 3 provides an approach to optimal ALT design for model discrimination. We utilize the Hellinger distance measure between predictive distributions. The optimal ALT plan at three stress levels was determined, and its performance was compared to a good compromise plan, the best traditional plan, and the well-known 4:2:1 compromise test plan. In the case of linear versus quadratic ALT models, the proposed method increased the test plan's ability to distinguish among competing models and provided better guidance as to which model is appropriate for the experiment.
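
For reference, the Hellinger distance between two discrete distributions p and q is H(p, q) = (1/sqrt(2)) * sqrt(sum_i (sqrt(p_i) - sqrt(q_i))^2); it is 0 for identical distributions and 1 for disjoint ones, which makes it a convenient bounded criterion for separating competing models' predictions. A minimal discrete-case sketch (the dissertation works with predictive distributions; this shows only the bare measure):

```python
import math

def hellinger(p, q):
    """Hellinger distance between two discrete probability vectors."""
    s = sum((math.sqrt(pi) - math.sqrt(qi)) ** 2 for pi, qi in zip(p, q))
    return math.sqrt(s) / math.sqrt(2.0)

# Partially overlapping distributions fall strictly between the extremes.
d = hellinger([0.5, 0.5, 0.0], [0.0, 0.5, 0.5])
```

A discrimination-optimal plan chooses stress settings that push this distance between the rival models' predictions as high as possible, so that observed failures favor one model decisively.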

Chapter 4 extends the approach of Chapter 3 to sequential ALT model discrimination. An initial experiment is conducted to provide the maximum possible information with respect to model discrimination. The follow-on experiment is then planned by leveraging the most current information, allowing for Bayesian model comparison through posterior model probability ratios. Results showed that the performance of the plan is adversely impacted by the amount of censoring in the data. In the case of linear versus quadratic model forms at three levels of constant stress, sequential testing can improve the model recovery rate by approximately 8% when the data are complete, but no apparent advantage to sequential testing was found for right-censored data once censoring exceeds a certain amount.
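
The posterior-model-probability-ratio idea can be illustrated in a much simpler setting with two fully specified lifetime models: by Bayes' rule, the posterior odds equal the prior odds times the likelihood ratio. A hedged sketch with made-up data (the exponential models and values are illustrative, not the dissertation's):

```python
import math

def posterior_model_odds(data, loglik_m1, loglik_m2, prior_odds=1.0):
    """Posterior odds P(M1 | data) / P(M2 | data) for two fully
    specified models: prior odds times the likelihood ratio."""
    log_bf = sum(loglik_m1(x) - loglik_m2(x) for x in data)
    return prior_odds * math.exp(log_bf)

def loglik_exponential(rate):
    """Log-likelihood of an exponential lifetime model with known rate."""
    return lambda t: math.log(rate) - rate * t

# Made-up complete (uncensored) lifetimes with mean near 1.
data = [0.8, 1.1, 0.9, 1.3, 1.0]
odds = posterior_model_odds(data, loglik_exponential(1.0), loglik_exponential(3.0))
```

With the sample mean near 1, the rate-1 model is strongly favored (odds well above 1). Censoring weakens this comparison because censored units contribute only survival probabilities, not full density terms, which is consistent with the degradation the chapter reports.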
Contributors: Nasir, Ehab (Author) / Pan, Rong (Thesis advisor) / Runger, George C. (Committee member) / Gel, Esma (Committee member) / Kao, Ming-Hung (Committee member) / Montgomery, Douglas C. (Committee member) / Arizona State University (Publisher)
Created: 2014
Description
No-confounding (NC) designs in 16 runs for 6, 7, and 8 factors are non-regular fractional factorial designs that have been suggested as attractive alternatives to the regular minimum-aberration resolution IV designs because they do not completely confound any two-factor interactions with each other. These designs allow for potential estimation of main effects and a few two-factor interactions without the need for follow-up experimentation. The analysis of non-regular designs is an area of ongoing research, because standard variable selection techniques such as stepwise regression may not always be the best approach. The current work investigates the use of the Dantzig selector for analyzing no-confounding designs. Through a series of examples, it shows that this technique is very effective for identifying the set of active factors when there are three or four active main effects and up to two active two-factor interactions.

To evaluate the performance of the Dantzig selector, a simulation study was conducted and the results were analyzed based on the percentage of type II errors. In addition, an alternative six-factor NC design, called the alternate no-confounding design in six factors, is introduced in this study. The performance of this alternate NC design is then evaluated using the Dantzig selector as the analysis method. Lastly, a section is dedicated to comparing the performance of the NC-6 and alternate NC-6 designs.
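
For intuition, the Dantzig selector chooses the coefficient vector with the smallest L1 norm among all vectors whose residuals are nearly uncorrelated with every column of the design matrix: minimize ||b||_1 subject to ||X^T(y - Xb)||_inf <= lambda. In practice this is solved by linear programming; the brute-force sketch below (illustrative only, with a made-up toy design) checks every point of a coarse coefficient grid instead:

```python
import itertools

def dantzig_selector_grid(X, y, lam, grid):
    """Brute-force Dantzig selector on a coarse coefficient grid:
    minimize ||b||_1 subject to max_j |X_j^T (y - X b)| <= lam.
    Only viable for tiny problems; real analyses use linear programming."""
    n, p = len(X), len(X[0])
    best, best_l1 = None, float("inf")
    for b in itertools.product(grid, repeat=p):
        resid = [y[i] - sum(X[i][j] * b[j] for j in range(p)) for i in range(n)]
        corr = max(abs(sum(X[i][j] * resid[i] for i in range(n))) for j in range(p))
        if corr <= lam:
            l1 = sum(abs(bj) for bj in b)
            if l1 < best_l1:
                best, best_l1 = b, l1
    return best

# Toy design: only the first column drives the response (true b = (2, 0)).
X = [[1, 0], [0, 1], [1, 0], [0, 1]]
y = [2, 0, 2, 0]
b_hat = dantzig_selector_grid(X, y, 1.0, [0, 0.5, 1.0, 1.5, 2.0])
```

Note the estimate comes back as (1.5, 0) rather than (2, 0): the lambda slack shrinks the active coefficient toward zero while holding the inactive one at exactly zero, the sparsity behavior that makes the selector attractive for screening active factors in no-confounding designs.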
Contributors: Krishnamoorthy, Archana (Author) / Montgomery, Douglas C. (Thesis advisor) / Borror, Connie (Thesis advisor) / Pan, Rong (Committee member) / Arizona State University (Publisher)
Created: 2014
Description
Supply chains are increasingly complex as companies branch out into newer products and markets. In many cases, multiple products with moderate differences in performance and price compete for the same unit of demand. Simultaneous occurrences of multiple scenarios (competitive, disruptive, regulatory, economic, etc.), coupled with business decisions (pricing, product introduction, etc.) can drastically change demand structures within a short period of time. Furthermore, product obsolescence and cannibalization are real concerns due to short product life cycles. Analytical tools that can handle this complexity are important to quantify the impact of business scenarios/decisions on supply chain performance. Traditional analysis methods struggle in this environment of large, complex datasets with hundreds of features becoming the norm in supply chains. We present an empirical analysis framework termed Scenario Trees that provides a novel representation for impulse and delayed scenario events and a direction for modeling multivariate constrained responses. Amongst potential learners, supervised learners and feature extraction strategies based on tree-based ensembles are employed to extract the most impactful scenarios and predict their outcome on metrics at different product hierarchies. These models are able to provide accurate predictions in modeling environments characterized by incomplete datasets due to product substitution, missing values, outliers, redundant features, mixed variables and nonlinear interaction effects. Graphical model summaries are generated to aid model understanding. Models in complex environments benefit from feature selection methods that extract non-redundant feature subsets from the data. Additional model simplification can be achieved by extracting specific levels/values that contribute to variable importance. 
We propose and evaluate new analytical methods to address this problem of feature value selection and study their comparative performance using simulated datasets. We show that supply chain surveillance can be structured as a feature value selection problem. For situations such as new product introduction, a bottom-up approach to scenario analysis is designed using an agent-based simulation and data mining framework. This simulation engine envelopes utility theory, discrete choice models and diffusion theory and acts as a test bed for enacting different business scenarios. We demonstrate the use of machine learning algorithms to analyze scenarios and generate graphical summaries to aid decision making.
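
One generic way to rank features in such models, regardless of the learner, is permutation importance: shuffle one feature's column and measure how much the prediction error grows. A minimal sketch with a hand-written stand-in for a fitted model (illustrative only; the dissertation's methods extend this kind of idea from whole features down to individual feature values/levels):

```python
import random

def permutation_importance(predict, X, y, col, n_repeats=20, seed=0):
    """Estimate a feature's importance as the average increase in
    mean-squared error after shuffling that feature's column."""
    rng = random.Random(seed)
    def mse(rows):
        return sum((predict(r) - t) ** 2 for r, t in zip(rows, y)) / len(y)
    base = mse(X)
    total = 0.0
    for _ in range(n_repeats):
        vals = [row[col] for row in X]
        rng.shuffle(vals)
        shuffled = [row[:col] + [v] + row[col + 1:] for row, v in zip(X, vals)]
        total += mse(shuffled) - base
    return total / n_repeats

# Toy stand-in for a fitted model: the response depends only on
# feature 0, so shuffling feature 1 should not change the error.
X = [[float(i), float(i % 3)] for i in range(30)]
y = [2.0 * row[0] for row in X]
model = lambda row: 2.0 * row[0]
```

Features (or feature values) whose shuffling barely moves the error are candidates for removal, which is the simplification step the surveillance framing above relies on.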
Contributors: Shinde, Amit (Author) / Runger, George C. (Thesis advisor) / Montgomery, Douglas C. (Committee member) / Villalobos, Rene (Committee member) / Janakiram, Mani (Committee member) / Arizona State University (Publisher)
Created: 2012
Description
Millennials make up the newest generation of the world's population; they are constantly surrounded by technology and are known for having different values than previous generations. Marketers have to adapt to new ways of appealing to millennials and securing their loyalty. Because millennials are always on the lookout for the next best thing and will "trade up for brands that matter, but trade down when brand value is weak," they pose a challenge for the marketing departments of companies (Fromm, J. & Parks, J.). The airline industry is one of the fastest growing sectors, as "the total number of people flying on U.S. airlines will increase from 745.5 million in 2014 and grow to 1.15 billion in 2034," which shows that airlines have a wider population to market to and will need to improve their marketing strategies to differentiate themselves from competitors (Power). The financial sector also has a difficult time reaching millennials because "millennials are hesitant to take financial risks"; they are also drowning in college debt while not making as much money as previous generations (Fromm, J. & Parks, J.). By looking into the marketing strategies of the two industries, specifically their use of social media platforms, an understanding can be gathered of what millennials are attracted to. Along with the marketing strategies of the financial and airline industries, I examined these industries in different countries, which matters because it shows whether the values of millennials vary across cultures. The countries chosen for research, to further examine cultural differences in marketing practices, are the United States and England.
The main materials used for this research were the companies' social media accounts, examined for how they used social networking platforms to reach and engage their consumers, especially those of the millennial generation. The airline companies chosen for further research from England were British Airways, EasyJet, and Virgin Atlantic, while for the U.S. Delta Air Lines, Southwest Airlines, and United were chosen. The finance companies chosen from England were Barclays, HSBC, and Lloyds Bank, while for the U.S. the banks selected were Bank of America, JPMorgan Chase, and Wells Fargo. The companies were chosen because they are among the top five in their industries and because I have had previous interactions with all of them. The aim was to see what the companies at the top of each industry were doing that set them apart from their competitors in terms of social media marketing content, and whether there were features they lacked or improvements they could make. A survey was also conducted to get a better idea of millennials' attitudes and behaviors toward the airline and finance industries, as well as toward social media marketing practices.
Contributors: Pathak, Krisha Hemanshu (Author) / Kumar, Ajith (Thesis director) / Arora, Hina (Committee member) / W. P. Carey School of Business (Contributor) / Department of Information Systems (Contributor) / Department of Marketing (Contributor) / Hugh Downs School of Human Communication (Contributor) / Barrett, The Honors College (Contributor)
Created: 2016-05
Description
This thesis, through a thorough literature and content review, discusses the various ways that data analytics and supply chain management intersect. Both fields have been around for a while, but they are greatly aided by the information age we live in today. Today's ERP systems and supply chain software packages use advanced analytic techniques and algorithms to optimize every aspect of supply chain management, including inventory optimization, portfolio management, network design, production scheduling, fleet planning, supplier evaluation, and others. The benefit of these analytic techniques is a reduction in costs as well as an improvement in overall supply chain performance and efficiency. The paper begins with a short historical context on business analytics and optimization, then moves on to the impact and application of analytics in the supply chain today. Following that, the implications of big data are explored, along with how a company might begin to take advantage of big data and what challenges a firm may face along the way. The current tools used by supply chain professionals are then discussed. There is then a section on the most up-and-coming technologies (the internet of things, blockchain technology, additive manufacturing (3D printing), and machine learning) and how those technologies may further enable the successful use of analytics to improve supply chain management. Companies that take advantage of analytics in their supply chains are sure to maintain a competitive advantage over firms that fail to do so.
Contributors: Cotton, Ryan Aaron (Author) / Taylor, Todd (Thesis director) / Arora, Hina (Committee member) / Department of Information Systems (Contributor) / Department of Supply Chain Management (Contributor) / Barrett, The Honors College (Contributor)
Created: 2016-12
Description

Most asteroids originated in larger parent bodies that underwent accretion and heating during the first few million years of the solar system. We investigated the parent body of the S-type asteroid 25143 Itokawa by developing a computational model that can approximate the thermal evolution of an early solar system body. We compared known constraints on Itokawa's thermal history to simulations of its parent body and constrained its time of formation to between 1.6 and 2.5 million years after the beginning of the solar system, though certain details could allow for even earlier or later formation. These results stress the need for precise data on the material properties of asteroids and meteorites in order to place better constraints on the histories of their parent bodies. Additional mathematical and computational details are discussed, and the full code and data are made available online.
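
The dominant early heat source in such models is the decay of short-lived 26Al, so a body's peak temperature is extremely sensitive to how much live 26Al it accretes, i.e., to its formation time. A deliberately lumped sketch of that sensitivity (not the thesis's conduction model; the heat budget and specific heat below are rough, assumed values):

```python
import math

# Assumed, illustrative parameters -- not taken from the thesis.
HALF_LIFE_MYR = 0.717   # 26Al half-life in Myr
Q_TOTAL = 6.7e6         # J/kg if all 26Al decays at initial abundance (assumed)
CP = 800.0              # J/(kg K) specific heat of rock (assumed)

def peak_temperature_rise(formation_myr):
    """Upper-bound radiogenic temperature rise for a perfectly insulated
    body that forms formation_myr after the start of the solar system."""
    lam = math.log(2) / HALF_LIFE_MYR
    return Q_TOTAL * math.exp(-lam * formation_myr) / CP
```

A full model couples this decaying source term to conduction through the body's interior, but even the lumped bound shows why a 1.6 vs 2.5 Myr formation time changes the thermal history so strongly: each additional half-life roughly halves the available heating.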

Contributors: Hallstrom, Jonas (Author) / Bose, Maitrayee (Thesis director) / Beckstein, Oliver (Committee member) / Barrett, The Honors College (Contributor) / Department of Physics (Contributor) / Materials Science and Engineering Program (Contributor)
Created: 2023-05
Description

This creative project develops an environment in which three species inhabit a shared land and models the movement of the creatures to determine survival rates over time under specific conditions. The three species modelled include a predator and a prey species with movement capabilities, as well as a stagnant fruit species. A variety of configurable variables can be used to modify and control the simulation to observe how the resulting population charts change. The big difference between this project and a typical approach to simulating a predation relationship is that the creatures themselves are created and their movement is simulated in a virtual environment, which then yields population counts, rather than integrating differential equations relating the population sizes of both species and tracking only the populations, not the creatures themselves. Because of this difference, my simulation is not meant to handle all the complexities of real-world life; it is instead a simplified approach to simulating creatures' lives, intended to convey the idea of a real predation relationship. Thus, the main objective of my simulation is to produce data representative of real-world predator-prey relationships, with the overall cyclical pattern observed in nature achieved by simulating creature movement and life itself rather than by estimating population size change.
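
An agent-based predation loop of the kind described reduces to: move every creature, resolve predation by co-location, apply reproduction and starvation, and record the populations. A stripped-down sketch under assumed rates (not the project's code; the grid size, reproduction probability, and starvation threshold are arbitrary):

```python
import random

def simulate(steps, n_prey=40, n_pred=8, size=20, seed=1):
    """Minimal agent model: creatures hold grid positions; predators eat
    a co-located prey, prey reproduce at a fixed rate, and predators die
    if they go too long without eating. Returns per-step populations."""
    rng = random.Random(seed)
    prey = [[rng.randrange(size), rng.randrange(size)] for _ in range(n_prey)]
    preds = [[rng.randrange(size), rng.randrange(size), 0] for _ in range(n_pred)]
    history = [(len(prey), len(preds))]
    for _ in range(steps):
        for c in prey:                      # random-walk movement (wrapping)
            c[0] = (c[0] + rng.choice([-1, 0, 1])) % size
            c[1] = (c[1] + rng.choice([-1, 0, 1])) % size
        for p in preds:
            p[0] = (p[0] + rng.choice([-1, 0, 1])) % size
            p[1] = (p[1] + rng.choice([-1, 0, 1])) % size
            p[2] += 1                       # steps since last meal
        for p in preds:                     # each predator eats at most one prey
            for c in prey:
                if c[0] == p[0] and c[1] == p[1]:
                    prey.remove(c)
                    p[2] = 0
                    break
        prey += [[c[0], c[1]] for c in prey if rng.random() < 0.05]  # births
        preds = [p for p in preds if p[2] < 15]                      # starvation
        history.append((len(prey), len(preds)))
    return history

history = simulate(50)
```

Plotting the (prey, predator) pairs in history over many steps is what exhibits, or fails to exhibit, the cyclical pattern that the differential-equation approach produces directly.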

Contributors: Perry, Jordan (Author) / Burger, Kevin (Thesis director) / Miller, Phillip (Committee member) / Barrett, The Honors College (Contributor) / Department of Physics (Contributor) / Computer Science and Engineering Program (Contributor)
Created: 2023-05
Description

We implemented the well-known Ising model in one dimension as a computer program and simulated its behavior with four algorithms: (i) the seminal Metropolis algorithm; (ii) the microcanonical algorithm described by Creutz in 1983; (iii) a variation on Creutz’s time-reversible algorithm allowing for bonds between spins to change dynamically; and (iv) a combination of the latter two algorithms in a manner reflecting the different timescales on which these two processes occur (“freezing” the bonds in place for part of the simulation). All variations on Creutz’s algorithm were symmetrical in time, and thus reversible. The first three algorithms all favored low-energy states of the spin lattice and generated the Boltzmann energy distribution after reaching thermal equilibrium, as expected, while the last algorithm broke from the Boltzmann distribution while the bonds were “frozen.” The interpretation of this result as a net increase to the system’s total entropy is consistent with the second law of thermodynamics, which leads to the relationship between maximum entropy and the Boltzmann distribution.
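
For concreteness, the Metropolis step for a 1D Ising chain flips a random spin and accepts the flip with probability min(1, exp(-dE/kT)), where dE involves only the two neighboring bonds. A generic textbook sketch in units J = kB = 1 (not necessarily the thesis's implementation):

```python
import math
import random

def metropolis_ising_1d(n, temperature, steps, seed=0):
    """Metropolis sampling of a 1D Ising chain with periodic boundaries.
    Returns the final spin configuration and its energy (J = kB = 1)."""
    rng = random.Random(seed)
    spins = [rng.choice([-1, 1]) for _ in range(n)]
    e = -sum(spins[i] * spins[(i + 1) % n] for i in range(n))
    for _ in range(steps):
        i = rng.randrange(n)
        # Energy change from flipping spin i: only its two bonds change.
        de = 2 * spins[i] * (spins[i - 1] + spins[(i + 1) % n])
        if de <= 0 or rng.random() < math.exp(-de / temperature):
            spins[i] = -spins[i]
            e += de
    return spins, e

spins, e = metropolis_ising_1d(50, temperature=1.0, steps=20000)
```

Histogramming the energy after equilibration is how one checks the Boltzmann distribution referred to above; Creutz's microcanonical variants replace the random acceptance with a deterministic, reversible "demon" energy exchange.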

Contributors: Lewis, Aiden (Author) / Chamberlin, Ralph (Thesis director) / Beckstein, Oliver (Committee member) / Barrett, The Honors College (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / Department of Physics (Contributor)
Created: 2023-05
Description
In this project, we created a code that simulates the dynamics of a three-site Hubbard-model ring connected to an infinite dissipative bath and driven by an electric field. We utilized the master equation approach, which may one day be implemented efficiently on a quantum computer. For now, we used classical computing to model one of the simplest nontrivial driven dissipative systems. This serves as a verification of the master equation method and a baseline to test against once it can be implemented on a quantum computer. In this report, we focus mainly on characterizing the DC component of the current around the ring. We observe several expected characteristics of this DC current, including an inverse-square tail at large values of the electric field and a linear-response region at small values of the electric field.
Contributors: Johnson, Michael (Author) / Chamberlin, Ralph (Thesis director) / Ritchie, Barry (Committee member) / School of Mathematical and Statistical Sciences (Contributor) / Department of Physics (Contributor) / Barrett, The Honors College (Contributor)
Created: 2020-05
Description

The goal of this project was to develop a prototype for an educational tool that helps users understand how the voting system deployed by a government can affect the outcomes of elections. The tool was developed in Java SE and consists of a model for the simulation of elections capable of supporting various voting systems, along with a variety of fairness measures and educational and explanatory material. While a completed version of this tool would ideally be fully self-contained, easily accessible in-browser, and provide detailed visualizations of the simulated elections, the current prototype consists of a GitHub repository containing the code, with the educational material and explanations contained within the thesis paper. Ultimately, this project was meant to be a stepping stone on the path to a tool that instills a measure of systemic skepticism in the user: to give them cause to question why our systems are built the way they are, and reasons to believe they could be changed for the better. In undertaking this project, I hope to help provide people with the political education needed to make informed decisions about how they want their government to function. The GitHub repository containing all the code can be found at https://github.com/SpencerDiamond/Votes_that_Count
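
The core of such a model is a ballot data structure plus interchangeable tally functions, one per voting system, so the same electorate can be scored under each system. A minimal sketch in Python rather than the project's Java SE (illustrative only; the ballots are made up), contrasting plurality with instant-runoff voting:

```python
from collections import Counter

def plurality(ballots):
    """Winner by first-choice counts alone."""
    return Counter(b[0] for b in ballots).most_common(1)[0][0]

def instant_runoff(ballots):
    """Repeatedly eliminate the candidate with the fewest first-place
    votes until some candidate holds a strict majority."""
    ballots = [list(b) for b in ballots]
    while True:
        counts = Counter(b[0] for b in ballots)
        total = sum(counts.values())
        top, top_votes = counts.most_common(1)[0]
        if top_votes * 2 > total:
            return top
        loser = min(counts, key=counts.get)
        ballots = [[c for c in b if c != loser] for b in ballots]

# Made-up electorate where the two systems disagree.
ballots = ([["A", "B", "C"]] * 30 +
           [["B", "A", "C"]] * 25 +
           [["C", "B", "A"]] * 45)
```

Here C leads on first choices (45 of 100) but lacks a majority; eliminating B transfers those voters to A, so instant runoff elects A instead. That divergence between systems on identical ballots is exactly the kind of outcome the tool is meant to make visible.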

Contributors: Diamond, Spencer (Author) / Sarjoughian, Hessam (Thesis director) / Hines, Taylor (Committee member) / Barrett, The Honors College (Contributor) / Department of Physics (Contributor) / Department of English (Contributor) / School of Mathematical and Statistical Sciences (Contributor)
Created: 2022-05