Matching Items (32)
148215-Thumbnail Image.png
Description

Time studies are an effective tool for analyzing current production systems and proposing improvements. The problem that motivated this project was that conducting time studies and observing the progression of components across the factory floor is a manual process. Four Industrial Engineering students worked with a manufacturing company to develop Computer Vision technology to automate the data collection process for time studies. The team worked in an Agile environment to complete over 120 classification sets, create 8 strategy documents, and apply Root Cause Analysis techniques to audit and validate the performance of the trained Computer Vision models. In the future, there is an opportunity to continue developing this product and to expand the team’s scope of work by applying further engineering analysis to the collected data to drive factory improvements.

Contributors: Johnson, Katelyn Rose (Co-author) / Martz, Emma (Co-author) / Chmelnik, Nathan (Co-author) / de Guzman, Lorenzo (Co-author) / Ju, Feng (Thesis director) / Courter, Brandon (Committee member) / Industrial, Systems & Operations Engineering Prgm (Contributor) / Barrett, The Honors College (Contributor)
Created: 2021-05
148216-Thumbnail Image.png
Description

Time studies are an effective tool for analyzing current production systems and proposing improvements. The problem that motivated this project was that conducting time studies and observing the progression of components across the factory floor is a manual process. Four Industrial Engineering students worked with a manufacturing company to develop Computer Vision technology to automate the data collection process for time studies. The team worked in an Agile environment to complete over 120 classification sets, create 8 strategy documents, and apply Root Cause Analysis techniques to audit and validate the performance of the trained Computer Vision models. In the future, there is an opportunity to continue developing this product and to expand the team’s scope of work by applying further engineering analysis to the collected data to drive factory improvements.

Contributors: Chmelnik, Nathan (Co-author) / de Guzman, Lorenzo (Co-author) / Johnson, Katelyn (Co-author) / Martz, Emma (Co-author) / Ju, Feng (Thesis director) / Courter, Brandon (Committee member) / Industrial, Systems & Operations Engineering Prgm (Contributor) / Barrett, The Honors College (Contributor)
Created: 2021-05
147540-Thumbnail Image.png
Description

Time studies are an effective tool for analyzing current production systems and proposing improvements. The problem that motivated this project was that conducting time studies and observing the progression of components across the factory floor is a manual process. Four Industrial Engineering students worked with a manufacturing company to develop Computer Vision technology to automate the data collection process for time studies. The team worked in an Agile environment to complete over 120 classification sets, create 8 strategy documents, and apply Root Cause Analysis techniques to audit and validate the performance of the trained Computer Vision models. In the future, there is an opportunity to continue developing this product and to expand the team’s scope of work by applying further engineering analysis to the collected data to drive factory improvements.

Contributors: Martz, Emma Marie (Co-author) / de Guzman, Lorenzo (Co-author) / Johnson, Katelyn (Co-author) / Chmelnik, Nathan (Co-author) / Ju, Feng (Thesis director) / Courter, Brandon (Committee member) / Industrial, Systems & Operations Engineering Prgm (Contributor) / Barrett, The Honors College (Contributor)
Created: 2021-05
148263-Thumbnail Image.png
Description

Time studies are an effective tool for analyzing current production systems and proposing improvements. The problem that motivated this project was that conducting time studies and observing the progression of components across the factory floor is a manual process. Four Industrial Engineering students worked with a manufacturing company to develop Computer Vision technology to automate the data collection process for time studies. The team worked in an Agile environment to complete over 120 classification sets, create 8 strategy documents, and apply Root Cause Analysis techniques to audit and validate the performance of the trained Computer Vision models. In the future, there is an opportunity to continue developing this product and to expand the team’s scope of work by applying further engineering analysis to the collected data to drive factory improvements.

Contributors: de Guzman, Lorenzo (Co-author) / Chmelnik, Nathan (Co-author) / Martz, Emma (Co-author) / Johnson, Katelyn (Co-author) / Ju, Feng (Thesis director) / Courter, Brandon (Committee member) / Industrial, Systems & Operations Engineering Prgm (Contributor) / School of Politics and Global Studies (Contributor) / Barrett, The Honors College (Contributor)
Created: 2021-05
161608-Thumbnail Image.png
Description
A production system is commonly restricted by time windows. For example, perishability is a major concern in food processing and requires products such as yogurt, beer, and meat not to stay in a buffer for long. Semiconductor manufacturing faces oxidation and moisture-absorption issues if a product in a buffer is exposed to air for too long. Machine reliability is a major source of uncertainty in production systems that causes residence time constraints to be violated, leading to potential product quality issues. Rapid advances in sensor technology and automation make it possible to manage production in real time, but the system complexity introduced by residence time constraints makes it difficult to optimize system performance while guaranteeing product quality. To this end, this dissertation is dedicated to the modeling, analysis, and control of production systems with constrained time windows. The study starts with a small-scale serial production line with two machines and one buffer. Even the simplest serial lines can have a prohibitively large state space once residence time constraints are considered. A Markov chain model is developed to approximate the line's transient behavior with high accuracy, and an iterative learning algorithm is proposed for real-time control. The analysis of the two-machine serial line supports the further analysis of more general and complex serial lines with multiple machines, where residence time constraints may be imposed at multiple stages. To handle this, a two-machine-one-buffer subsystem isolated from a multi-stage serial production line is first analyzed and then serves as a building block in an aggregation method for evaluating overall system performance. The proposed aggregation method substantially reduces the complexity of the problem while maintaining high accuracy. A decomposition-based control approach is proposed to control a multi-stage serial production line.
The production system is decomposed into small-scale subsystems, and an iterative aggregation procedure then generates a production control policy. The decomposition-based control approach outperforms general-purpose reinforcement learning methods, delivering significant system performance improvement and a substantial reduction in computation overhead. A semiconductor assembly line is a typical production system in which products are restricted by time windows and production can be disrupted by machine failures. A production control problem for a semiconductor assembly line is formulated and studied to minimize total lot delay time and residence time constraint violations.
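As a rough illustration of the kind of system this abstract describes, the sketch below simulates a two-machine, one-buffer line with unreliable machines and counts residence-time violations. The geometric up/down machine model and all parameter values are illustrative assumptions, not the dissertation's actual Markov chain model.

```python
import random

def simulate_line(steps=10_000, buffer_cap=5, p_fail=0.05, p_repair=0.3,
                  max_residence=8, seed=42):
    """Toy discrete-time simulation of a two-machine, one-buffer line.

    Machine 1 feeds the buffer and machine 2 drains it; each machine is
    either up or down, with geometric failures and repairs. A part that
    waits in the buffer longer than `max_residence` time slots counts as
    a residence-time violation. Returns (throughput, violations).
    """
    rng = random.Random(seed)
    buffer = []            # ages (in time slots) of parts waiting
    up = [True, True]      # machine up/down states
    produced = violations = 0
    for _ in range(steps):
        # Update machine states: up machines may fail, down ones may be repaired
        for i in (0, 1):
            up[i] = rng.random() >= p_fail if up[i] else rng.random() < p_repair
        # Machine 2 consumes the oldest waiting part, if any
        if up[1] and buffer:
            age = buffer.pop(0)
            produced += 1
            if age > max_residence:
                violations += 1
        # Machine 1 adds a fresh part if the buffer has space
        if up[0] and len(buffer) < buffer_cap:
            buffer.append(0)
        # Parts still waiting grow older
        buffer = [age + 1 for age in buffer]
    return produced, violations
```

Raising `p_fail` or shrinking `max_residence` in this toy model increases the violation count, which is the tension between throughput and quality that the dissertation's control policies are designed to manage.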
ContributorsWang, Feifan (Author) / Ju, Feng (Thesis advisor) / Askin, Ronald (Committee member) / Mirchandani, Pitu (Committee member) / Patel, Nital (Committee member) / Arizona State University (Publisher)
Created2021
171769-Thumbnail Image.png
Description
Electromigration, the net atomic diffusion associated with the momentum transfer from electrons moving through a material, is a major cause of device and component failure in microelectronics. The deleterious effects of electromigration rise with increased current density, a parameter that will only continue to increase as electronic devices become smaller and more compact. Understanding the dynamic diffusional pathways and mechanisms of these electromigration-induced and propagated defects can further attempts at mitigating these failure modes. This dissertation provides insight into the relationships between these defects and the parameters of electric field strength, grain boundary misorientation, grain size, void size, eigenstrain, varied atomic mobilities, and microstructure. First, an existing phase-field model was modified to investigate the various defect modes associated with electromigration in an equiaxed, non-columnar microstructure. Of specific interest were the effect of grain boundary misalignment with respect to current flow and the mechanisms responsible for changes in defect kinetics. Grain size, the magnitude of the externally applied electric field, and the use of locally distinct atomic mobilities were other parameters investigated. Networks of randomly distributed grains, a common interconnect microstructure, were simulated in both two and three dimensions, displaying the effects of 3-D capillarity on diffusional dynamics. A numerical model was also developed to study the effect of electromigration on void migration and coalescence. Void migration rates were found to be slowed by compressive forces, and the nature of the deformation concurrent with migration was examined through the lens of chemical potential. Void migration was also validated against previously reported theoretical explanations. Void coalescence and void budding were investigated and found to depend on the magnitude of interfacial energy and electric field strength.
A grasp of the mechanistic pathways of electromigration-induced defect evolution is imperative to the development of reliable electronics, especially as electronic devices continue to miniaturize. This dissertation presents a working understanding of the mechanistic pathways by which interconnects can fail due to electromigration and provides direction for future research.
Contributors: Farmer, William McHann (Author) / Ankit, Kumar (Thesis advisor) / Chawla, Nikhilesh (Committee member) / Jiao, Yang (Committee member) / McCue, Ian (Committee member) / Arizona State University (Publisher)
Created: 2022
190884-Thumbnail Image.png
Description
Existing machine learning and data mining techniques have difficulty handling three characteristics of real-world data sets together in a computationally efficient way: (1) mixed data types with both categorical and numeric data, (2) different variable relations in different value ranges of variables, and (3) unknown variable dependency. This dissertation develops a Partial-Value Association Discovery (PVAD) algorithm to overcome these drawbacks of existing techniques. The algorithm also enables the discovery of partial-value and full-value variable associations, showing both the effects of individual variables and the interactive effects of multiple variables. For validation, the algorithm is compared with association rule mining and decision trees; the results show that the PVAD algorithm overcomes the shortcomings of existing methods. The second part of this dissertation focuses on knee point detection in noisy data. This extended research topic was inspired by the investigation into categorization of numeric data, which corresponds to Step 1 of the PVAD algorithm. A new mathematical definition of a knee point on discrete data is introduced. Because ground truth and benchmark data sets are unavailable, functions used to generate synthetic data are carefully selected and defined, then employed to create the data sets for the experiments. These synthetic data sets make it possible to systematically evaluate and compare the performance of existing methods. Additionally, a deep-learning model is devised for this problem. Experiments show that the proposed model surpasses existing methods on all synthetic data sets, regardless of whether the samples have single or multiple knee points. The third section presents the application of the PVAD algorithm to real-world data sets in various domains.
These include energy consumption data from an Arizona State University (ASU) building, computer network flow data, and ASU Engineering freshman retention data. The PVAD algorithm is used to create an associative network for energy consumption modeling, to analyze univariate and multivariate measures of network flow variables, and to identify common and uncommon characteristics related to engineering student retention after the first year at the university. The findings indicate that the PVAD algorithm offers the capability to uncover variable relationships.
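The abstract does not give the dissertation's mathematical definition of a knee point, but a common baseline for knee detection on discrete data, useful for comparison with the methods the abstract mentions, is the maximum-distance-to-chord rule: pick the sample farthest from the straight line joining the first and last points.

```python
def knee_point(xs, ys):
    """Return the index of the knee: the sample with the largest
    perpendicular distance to the chord joining the first and last
    points of the curve. A simple baseline, not the PVAD definition.
    """
    x0, y0 = xs[0], ys[0]
    x1, y1 = xs[-1], ys[-1]
    dx, dy = x1 - x0, y1 - y0
    norm = (dx * dx + dy * dy) ** 0.5
    best_i, best_d = 0, -1.0
    for i, (x, y) in enumerate(zip(xs, ys)):
        # Unsigned distance from (x, y) to the chord through the endpoints
        d = abs(dy * (x - x0) - dx * (y - y0)) / norm
        if d > best_d:
            best_i, best_d = i, d
    return best_i
```

On a convex decaying curve such as y = 1/(x + 1) sampled at integer x, this rule places the knee near the start of the flat region; on noisy data it would typically be applied after smoothing, which is where learned detectors like the abstract's deep model have an advantage.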
Contributors: Fok, Ting Yan (Author) / Ye, Nong (Thesis advisor) / Iquebal, Ashif (Committee member) / Ju, Feng (Committee member) / Collofello, James (Committee member) / Arizona State University (Publisher)
Created: 2023
171828-Thumbnail Image.png
Description
Solid-state and non-equilibrium processing methods are of great interest to researchers due to their ability to control and refine the bulk and/or surface microstructure of metallic alloys and push them beyond their conventional property limits. In this dissertation, a solid-state process, Shear Assisted Processing and Extrusion (ShAPE), and non-equilibrium processes, surface mechanical attrition treatment (SMAT) and additive manufacturing (AM), were used to process magnesium and aluminum alloys, respectively. A synergistic investigation of processing-induced microstructural modification and its effect on corrosion resistance was performed using ex-situ, quasi in-situ, and in-situ electrochemical, microscopy, and spectroscopy characterization techniques. To evaluate the effect of the same processing condition on a range of microstructures, a variety of magnesium alloys (AZ31B, Mg-3Si, ZK60, and pure Mg) were processed using the novel solid-state ShAPE method. ShAPE induced significant grain refinement, a homogenized distribution of second phases, and low residual strain in the AZ31B alloy, which contributed to a nobler breakdown potential, a stable protective film, and hence better corrosion resistance compared with the parent extruded counterpart. However, with variations in the composition, volume fraction, and distribution of second phases in the Mg-3Si and ZK60 alloys, the opposite response was observed, indicating that corrosion depends more strongly on the underlying microstructure than on the processing condition. The non-equilibrium processes, SMAT and AM, were used to process high-strength 7xxx-series aluminum alloys. Continuous high-energy impacts of hard balls at room temperature (RT SMAT) and in a liquid nitrogen flow environment (LN2 SMAT) generated a gradient nanocrystalline surface layer, with dissolution of the inherent second phase and precipitation of new phases in aluminum 7075 alloys.
RT SMAT showed a reduced anodic dissolution rate and improved film resistance, attributed to a thicker, composite oxide layer along with new nanoscale precipitates. Lastly, reactive AM was used to process aluminum 7075 and 7050 alloys, resulting in a refined and texture-free microstructure. A reduction in corrosion resistance was observed with the precipitation of excess reactive particles (Ti and B4C) in the AM alloys compared with their wrought counterparts.
Contributors: Beura, Vikrant Kumar (Author) / Solanki, Kiran N (Thesis advisor) / Peralta, Pedro (Committee member) / Alford, Terry (Committee member) / Ankit, Kumar (Committee member) / Joshi, Vineet V (Committee member) / Arizona State University (Publisher)
Created: 2022
171838-Thumbnail Image.png
Description
Sequential event prediction, or sequential pattern mining, is a well-studied topic in the literature. In many real-world scenarios data are released sequentially, and event sequences often exhibit repetitive patterns that make future events predictable. For example, many companies build recommender systems to predict the next likely product for a user based on purchase history. Healthcare systems discover relationships among patients’ sequential symptoms to mitigate the adverse effects of a treatment (drugs or surgery). Modern engineering systems, such as aviation, distributed computing, and energy systems, diagnose failure event logs and take prompt action to avoid disaster when a similar failure pattern recurs. In this dissertation, I focus on building a scalable algorithm for event prediction and extraction in the aviation domain. Understanding accident events is a central safety concern in aviation: a flight accident is often caused by a sequence of failure events, so accurately modeling the failure event sequence and how it leads to the final accident is important for aviation safety. This work aims to study the relationships within the failure event sequence and to evaluate the risk of the final accident from these failure events. I address three major challenges. (1) Modeling sequential events with hierarchical structure: I aim to improve prediction accuracy by taking advantage of the multi-level, hierarchical representation of these rare events. Specifically, I propose a sequential encoder-decoder framework with a hierarchical embedding representation of the events. (2) Lack of high-quality, consistent event log data: to acquire more accurate event data from aviation accident reports, I convert the problem into multi-label classification.
An attention-based Bidirectional Encoder Representations from Transformers (BERT) model is developed to achieve good performance and interpretability. (3) Ontology-based event extraction: to extract detailed events, I pose the problem as a hierarchical classification task and improve model performance by incorporating an event ontology. By addressing these three challenges, I provide a framework to extract events from narrative reports and estimate the risk level of aviation accidents through event sequence modeling.
Contributors: Zhao, Xinyu (Author) / Yan, Hao (Thesis advisor) / Liu, Yongming (Committee member) / Ju, Feng (Committee member) / Iquebal, Ashif (Committee member) / Arizona State University (Publisher)
Created: 2022
171633-Thumbnail Image.png
Description
Additive manufacturing consists of the successive fabrication of materials layer upon layer to manufacture three-dimensional items. Several key problems, such as poor quality of finished products and excessive operational costs, must be addressed before it becomes widely applicable in industry. Retroactive, offline actions such as post-manufacturing inspections for defect detection in finished products are not only extremely expensive and ineffective but also incapable of issuing corrective-action signals during the build. In-situ monitoring and optimal control methods, on the other hand, provide viable alternatives for online anomaly detection and process control. Nevertheless, the complexity of process assumptions, the unique structure of the collected data, and the high-frequency data acquisition rate severely degrade the performance of traditional, parametric control and process monitoring approaches. Among the diverse categories of additive manufacturing, Large-Scale Additive Manufacturing (LSAM) by material extrusion and Laser Powder Bed Fusion (LPBF) suffer the most because of their more advanced technologies and are therefore the subjects of this work. In LSAM, the geometry of large parts can affect heat dissipation and lead to large thermal gradients between distant locations on the surface. The surface temperature profile is captured by an infrared thermal camera and translated into a non-linear regression model that formulates the surface cooling dynamics. The surface temperature prediction methodology is then combined with an optimization model with probabilistic constraints for real-time layer time and material flow control. On-axis optical high-speed cameras capture streams of melt pool images of the laser-powder interaction in real time during the process.
Model-agnostic deep learning methods offer a great deal of flexibility when facing such unstructured big data and are thus appealing alternatives to physics-based and regression-based modeling counterparts. A Convolutional Long Short-Term Memory (ConvLSTM) auto-encoder configuration is proposed to learn a deep spatio-temporal representation from sequences of melt pool images collected from experimental builds. The unfolded bottleneck tensors are then further mined to construct an anomaly detection and monitoring procedure with high accuracy and a low false alarm rate.
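The abstract's monitoring procedure mines autoencoder bottleneck tensors; the details are not given here, so the sketch below shows only a generic final step that such pipelines commonly use: flagging frames whose reconstruction error exceeds a control limit estimated from an in-control window. The window fraction and the three-sigma rule are illustrative assumptions, not the dissertation's procedure.

```python
import statistics

def anomaly_flags(errors, train_frac=0.5, k=3.0):
    """Flag frames whose reconstruction error exceeds mean + k * sigma
    of an in-control training window (the first `train_frac` of the
    stream) -- a control-chart-style rule often applied to autoencoder
    residuals. Returns a list of booleans, one per frame.
    """
    n_train = max(2, int(len(errors) * train_frac))
    mu = statistics.mean(errors[:n_train])
    sigma = statistics.pstdev(errors[:n_train])
    limit = mu + k * sigma
    return [e > limit for e in errors]
```

The constant `k` trades detection power against the false alarm rate: a larger `k` widens the limit and suppresses false alarms at the cost of missing subtle anomalies, which is the same trade-off the abstract's procedure tunes.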
Contributors: Fathizadan, Sepehr (Author) / Ju, Feng (Thesis advisor) / Wu, Teresa (Committee member) / Lu, Yan (Committee member) / Iquebal, Ashif (Committee member) / Arizona State University (Publisher)
Created: 2022