Matching Items (15)

Description

Time studies are an effective tool for analyzing current production systems and proposing improvements. The project was motivated by the fact that conducting time studies and observing the progression of components across the factory floor is a manual process. Four Industrial Engineering students worked with a manufacturing company to develop Computer Vision technology to automate data collection for time studies. Working in an Agile environment, the team completed over 120 classification sets, created 8 strategy documents, and used Root Cause Analysis techniques to audit and validate the performance of the trained Computer Vision models. In the future, there is an opportunity to continue developing this product and to expand the team's scope of work, applying further engineering skills to the collected data to drive factory improvements.
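
The thesis does not publish code, but the pipeline it describes can be sketched: a vision model classifies which station a component occupies in each video frame, and those per-frame labels are collapsed into timed dwell spans for the time study. The sketch below is purely illustrative; the station names, frame rate, and classifier output format are invented here, not the team's actual system.

    from itertools import groupby

    def extract_station_times(frame_labels, fps=30):
        """Collapse per-frame station labels into (station, seconds) dwell spans."""
        spans = []
        for station, run in groupby(frame_labels):
            spans.append((station, sum(1 for _ in run) / fps))
        return spans

    # 90 frames at cutting and 150 at welding -> 3.0 s and 5.0 s at 30 fps
    labels = ["cutting"] * 90 + ["welding"] * 150   # stand-in for classifier output
    print(extract_station_times(labels))            # [('cutting', 3.0), ('welding', 5.0)]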

Contributors: de Guzman, Lorenzo (Co-author) / Chmelnik, Nathan (Co-author) / Martz, Emma (Co-author) / Johnson, Katelyn (Co-author) / Ju, Feng (Thesis director) / Courter, Brandon (Committee member) / Industrial, Systems & Operations Engineering Prgm (Contributor) / School of Politics and Global Studies (Contributor) / Barrett, The Honors College (Contributor)
Created: 2021-05
Description

Time studies are an effective tool for analyzing current production systems and proposing improvements. The project was motivated by the fact that conducting time studies and observing the progression of components across the factory floor is a manual process. Four Industrial Engineering students worked with a manufacturing company to develop Computer Vision technology to automate data collection for time studies. Working in an Agile environment, the team completed over 120 classification sets, created 8 strategy documents, and used Root Cause Analysis techniques to audit and validate the performance of the trained Computer Vision models. In the future, there is an opportunity to continue developing this product and to expand the team's scope of work, applying further engineering skills to the collected data to drive factory improvements.

Contributors: Johnson, Katelyn Rose (Co-author) / Martz, Emma (Co-author) / Chmelnik, Nathan (Co-author) / de Guzman, Lorenzo (Co-author) / Ju, Feng (Thesis director) / Courter, Brandon (Committee member) / Industrial, Systems & Operations Engineering Prgm (Contributor) / Barrett, The Honors College (Contributor)
Created: 2021-05
Description

Time studies are an effective tool for analyzing current production systems and proposing improvements. The project was motivated by the fact that conducting time studies and observing the progression of components across the factory floor is a manual process. Four Industrial Engineering students worked with a manufacturing company to develop Computer Vision technology to automate data collection for time studies. Working in an Agile environment, the team completed over 120 classification sets, created 8 strategy documents, and used Root Cause Analysis techniques to audit and validate the performance of the trained Computer Vision models. In the future, there is an opportunity to continue developing this product and to expand the team's scope of work, applying further engineering skills to the collected data to drive factory improvements.

Contributors: Chmelnik, Nathan (Co-author) / de Guzman, Lorenzo (Co-author) / Johnson, Katelyn (Co-author) / Martz, Emma (Co-author) / Ju, Feng (Thesis director) / Courter, Brandon (Committee member) / Industrial, Systems & Operations Engineering Prgm (Contributor) / Barrett, The Honors College (Contributor)
Created: 2021-05
Description

Time studies are an effective tool for analyzing current production systems and proposing improvements. The project was motivated by the fact that conducting time studies and observing the progression of components across the factory floor is a manual process. Four Industrial Engineering students worked with a manufacturing company to develop Computer Vision technology to automate data collection for time studies. Working in an Agile environment, the team completed over 120 classification sets, created 8 strategy documents, and used Root Cause Analysis techniques to audit and validate the performance of the trained Computer Vision models. In the future, there is an opportunity to continue developing this product and to expand the team's scope of work, applying further engineering skills to the collected data to drive factory improvements.

Contributors: Martz, Emma Marie (Co-author) / de Guzman, Lorenzo (Co-author) / Johnson, Katelyn (Co-author) / Chmelnik, Nathan (Co-author) / Ju, Feng (Thesis director) / Courter, Brandon (Committee member) / Industrial, Systems & Operations Engineering Prgm (Contributor) / Barrett, The Honors College (Contributor)
Created: 2021-05
Description
Monitoring a system for deviations from standard or reference behavior is essential for many data-driven tasks. Whether monitoring sensor data or the interactions between system elements, such as edges in a path or transactions in a network, the goal is to detect significant changes from a reference. As technological advancements allow more data to be collected from systems, monitoring approaches should evolve to accommodate high-dimensional data and complex system settings. This dissertation introduces system-level models for monitoring tasks characterized by changes in a subset of system components, utilizing component-level information and relationships. A change may affect only a portion of the data or system (a partial change). The first three parts of this dissertation present applications and methods for detecting partial changes. The first part introduces a methodology for partial change detection in a simple, univariate setting: changes are detected with posterior probabilities and statistical mixture models that allow only a fraction of the data to change. The second and third parts center on monitoring more complex multivariate systems modeled through networks, where the goal is to detect partial changes in the underlying network attributes and topology. Their contributions are two non-parametric, system-level monitoring techniques that consider relationships between network elements. The first algorithm, Supervised Network Monitoring (SNetM), leverages Graph Neural Networks to transform the problem into supervised learning; the second, Supervised Network Monitoring for Partial Temporal Inhomogeneity (SNetMP), first generates a network embedding and then performs the same transformation. Both SNetM and SNetMP then construct measures and convert them to pseudo-probabilities that are monitored for changes. The last topic addresses predicting and monitoring system-level delays on paths in a transportation/delivery system. For each item, the risk of delay is quantified: machine learning is used to build a system-level delay-risk model that integrates edge-level models, given the information available on the edges of a path (such as environmental conditions). The outputs can then be used in a system-wide monitoring framework, and the items most at risk are identified for potential corrective actions.
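
The univariate mixture-model idea in the first part can be illustrated with a minimal sketch, not the dissertation's actual estimator: a fraction eps of observations is allowed to shift from a reference N(0, 1) to N(delta, 1), and each observation receives a posterior probability of belonging to the shifted component. In practice eps and delta would be estimated (e.g., by EM); here they are assumed known for brevity.

    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(0)
    x = np.concatenate([rng.normal(0.0, 1.0, 90),   # reference (unchanged) data
                        rng.normal(3.0, 1.0, 10)])  # partial change: 10% shifted

    eps, delta = 0.1, 3.0            # assumed known here; estimated in practice
    f0 = norm.pdf(x, 0.0, 1.0)       # reference-component density
    f1 = norm.pdf(x, delta, 1.0)     # changed-component density
    posterior = eps * f1 / (eps * f1 + (1 - eps) * f0)

    flagged = np.flatnonzero(posterior > 0.5)   # observations likely changed
    print(f"{flagged.size} of {x.size} observations flagged as changed")
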
Contributors: Kasaei Roodsari, Maziar (Author) / Runger, George (Thesis advisor) / Escobedo, Adolfo (Committee member) / Pan, Rong (Committee member) / Shinde, Amit (Committee member) / Arizona State University (Publisher)
Created: 2021
Description

Lean philosophy is a set of practices aimed at reducing waste in an industry or enterprise. By eliminating the aspects of a system that do not add value, the process can work in a continuous flow and, as a result, achieve a shorter cycle time. With a shorter cycle time, fewer resources are consumed, and effort can be distributed appropriately to achieve maximum efficiency. Relatedly, Six Sigma is a methodology that aims to reduce the variability of a system and, in turn, reduce the number of defects and improve the overall quality of a product or process. For this reason, Lean and Six Sigma go hand-in-hand: cutting out non-value-adding steps in a process increases efficiency, and perfecting the steps that remain improves quality. Both aspects are important to the success of a business practice. The DNASU Plasmid Repository would be a major beneficiary of the Lean Six Sigma process. The process of cloning DNA requires great attention to detail and time in order to avoid defects. For instance, any mistake made in the bacterial growth process, such as contamination, results in a significant amount of wasted time. In addition, the DNA purification steps also require vigilant observation, since the procedure is highly susceptible to small mistakes with large impacts. The goal of this project is to integrate Lean Six Sigma methodology into the DNASU laboratory. By applying numerous aspects of Lean Six Sigma, the DNA repository will be able to improve the efficiency and quality of its processes and achieve its highest rate of success.
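
As a concrete illustration of the Six Sigma defect metric mentioned above, the conventional calculation converts defects per million opportunities (DPMO) into a sigma level, including the customary 1.5-sigma long-term shift. The counts below are hypothetical, not DNASU data.

    from scipy.stats import norm

    # hypothetical counts for one plasmid-prep step, not DNASU data
    defects, units, opportunities = 12, 400, 5
    dpmo = defects / (units * opportunities) * 1_000_000
    sigma_level = norm.ppf(1 - dpmo / 1_000_000) + 1.5   # conventional 1.5-sigma shift

    print(f"DPMO = {dpmo:.0f}, sigma level = {sigma_level:.2f}")   # DPMO = 6000, ~4.01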

Contributors: Morton, Haley (Author) / McCarville, Daniel (Thesis director) / Eyerly, Ann (Committee member) / Taylor, Clayton (Committee member) / Barrett, The Honors College (Contributor) / Industrial, Systems & Operations Engineering Prgm (Contributor)
Created: 2023-05
Description

This paper analyzes the impact of the December 2022 winter storm on Southwest Airlines (SWA). The storm caused delays and cancellations for all airlines, but SWA was the only major airline unable to recover fully. The disruption was unique due to the higher volume of holiday-season travelers and the lack of good alternative transportation for stranded passengers. The paper explains SWA's point-to-point (PTP) model, which allows it to offer competitive ticket prices, and the organizational factors that have helped it hold a significant market share. The paper also discusses previous failures of SWA's IT and aircraft maintenance management systems and its outdated crewing system, which were not addressed until after the storm. The paper uses AnyLogic agent-based modeling to investigate why SWA was so severely affected and why it took so long to recover.
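
An AnyLogic model cannot be reproduced in a few lines, but the core fragility the paper attributes to the point-to-point model can be sketched in a deliberately simplified stand-in: each aircraft flies a chain of legs, so cancelling one leg strands the aircraft and cancels everything downstream on its rotation. The tail numbers and routes below are invented for illustration.

    def cascade_cancellations(rotations, storm_cancelled):
        """rotations: {tail_number: [leg, ...]} in flying order;
        storm_cancelled: set of legs cancelled directly by the storm."""
        cancelled = set()
        for tail, legs in rotations.items():
            out_of_position = False
            for leg in legs:
                out_of_position = out_of_position or leg in storm_cancelled
                if out_of_position:   # aircraft never reaches this leg's origin
                    cancelled.add(leg)
        return cancelled

    rotations = {"N123": ["PHX-DEN", "DEN-MDW", "MDW-BWI"],  # one aircraft's day
                 "N456": ["DAL-HOU", "HOU-MSY"]}
    print(cascade_cancellations(rotations, {"PHX-DEN"}))
    # all three N123 legs are lost: one storm cancellation propagates down the
    # whole point-to-point rotation, with no hub bank of spare connections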

Contributors: Bray, Mariana (Author) / McCarville, Daniel (Thesis director) / Kucukozyigit, Ali (Committee member) / Barrett, The Honors College (Contributor) / Industrial, Systems & Operations Engineering Prgm (Contributor) / School of Mathematical and Statistical Sciences (Contributor)
Created: 2023-05
Description
The rank aggregation problem has ubiquitous applications in operations research, artificial intelligence, computational social choice, and various other fields. Generally, rank aggregation is utilized whenever a set of judges (human or non-human) express their preferences over a set of items and it is necessary to find a consensus ranking that best represents these preferences collectively. Many real-world instances of this problem involve a very large number of items, include ties, and/or contain partial information, which poses a challenge to decision-makers. This work makes several contributions to overcoming these challenges. Most attention on this problem has focused on an NP-hard distance-based variant known as Kemeny aggregation, for which solution approaches with provable guarantees that can handle difficult large-scale instances remain elusive. Firstly, this work introduces exact and approximate methodologies inspired by the social choice foundations of the problem, namely the Condorcet criterion, to decompose the problem. To deal with instances where exact partitioning does not yield many subsets, it proposes Approximate Condorcet Partitioning, a scalable solution technique capable of handling large-scale instances while providing provable guarantees. Secondly, this work delves into the rank aggregation problem under the generalized Kendall-tau distance, which contains Kemeny aggregation as a special case and provides a robust and highly flexible framework for handling ties. First, it derives exact and heuristic solution methods for the generalized problem. Second, it introduces a novel social choice property that encloses existing variations of the Condorcet criterion as special cases. Thirdly, this work focuses on top-k list aggregation. Top-k lists are a special form of item ordering wherein, out of n total items, only a small number of them, k, are explicitly ordered. Top-k lists are being increasingly utilized in various fields including recommendation systems, information retrieval, and machine learning. This work introduces exact and inexact methods for consolidating a collection of heterogeneous top-k lists. Furthermore, the strength of the proposed exact formulations is analyzed from a polyhedral point of view. Finally, this work identifies the top-100 U.S. universities by consolidating four prominent university rankings to assess the computational implications of this problem.
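
For readers unfamiliar with Kemeny aggregation, a toy illustration: the Kendall-tau distance counts pairwise disagreements between two rankings, and the Kemeny ranking minimizes the total distance to all input rankings. Brute force over permutations, as below, is feasible only for tiny n, which is exactly why scalable methods like those in this dissertation matter.

    from itertools import combinations, permutations

    def kendall_tau(r1, r2):
        """Number of item pairs ranked in opposite orders by r1 and r2."""
        pos1 = {item: i for i, item in enumerate(r1)}
        pos2 = {item: i for i, item in enumerate(r2)}
        return sum(1 for a, b in combinations(r1, 2)
                   if (pos1[a] - pos1[b]) * (pos2[a] - pos2[b]) < 0)

    def kemeny(rankings):
        """Consensus ranking minimizing total Kendall-tau distance (brute force)."""
        return min(permutations(rankings[0]),
                   key=lambda cand: sum(kendall_tau(cand, r) for r in rankings))

    judges = [("A", "B", "C", "D"), ("B", "A", "C", "D"), ("A", "C", "B", "D")]
    print(kemeny(judges))   # ('A', 'B', 'C', 'D'): 2 total pairwise disagreements
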
Contributors: Akbari, Sina (Author) / Escobedo, Adolfo (Thesis advisor) / Byeon, Geunyeong (Committee member) / Sefair, Jorge (Committee member) / Wu, Shin-Yi (Committee member) / Arizona State University (Publisher)
Created: 2022