Matching Items (4)
Description

High-entropy alloys possessing mechanical, chemical, and electrical properties that far exceed those of conventional alloys have the potential to make a significant impact on many areas of engineering. Identifying element combinations and configurations to form these alloys, however, is a difficult, time-consuming, computationally intensive task. Machine learning has revolutionized many different fields due to its ability to generalize well to different problems and produce computationally efficient, accurate predictions regarding the system of interest. In this thesis, we demonstrate the effectiveness of machine learning models applied to toy cases representative of simplified physics that are relevant to high-entropy alloy simulation. We show these models are effective at learning nonlinear dynamics for single and multi-particle cases and that more work is needed to accurately represent complex cases in which the system dynamics are chaotic. This thesis serves as a demonstration of the potential benefits of machine learning applied to high-entropy alloy simulations to generate fast, accurate predictions of nonlinear dynamics.
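As an illustration of the surrogate-modeling idea this abstract describes, the sketch below learns the one-step dynamics of a damped pendulum, a toy nonlinear single-particle system, from simulated trajectory data. The pendulum parameters and the feature choice are illustrative assumptions, not taken from the thesis itself.

```python
import numpy as np

# Toy nonlinear single-particle system: damped pendulum (illustrative parameters).
g_over_L, damping, dt = 9.81, 0.10, 0.01

def euler_step(theta, omega):
    """One explicit-Euler step of the damped pendulum."""
    return theta + dt * omega, omega + dt * (-g_over_L * np.sin(theta) - damping * omega)

# Generate a training trajectory from a nonzero initial angle.
theta, omega = 1.0, 0.0
states = [(theta, omega)]
for _ in range(2000):
    theta, omega = euler_step(theta, omega)
    states.append((theta, omega))
states = np.array(states)

# Learn the one-step state update from the features [theta, omega, sin(theta)].
X = np.column_stack([states[:-1, 0], states[:-1, 1], np.sin(states[:-1, 0])])
Y = states[1:] - states[:-1]                      # per-step state increments
coeffs, residuals, *_ = np.linalg.lstsq(X, Y, rcond=None)

# Because the Euler update is linear in these features, the learned surrogate
# reproduces the simulated dynamics to machine precision.
pred = states[:-1] + X @ coeffs
print(np.max(np.abs(pred - states[1:])))
```

With a good feature basis the surrogate is essentially exact here; the thesis's harder multi-particle and chaotic cases are precisely where such simple fits break down and neural-network models become attractive.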

Contributors: Daly, John H (Author) / Ren, Yi (Thesis director) / Zhuang, Houlong (Committee member) / Mechanical and Aerospace Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2021-05
Description

In the last two decades, fantasy sports have grown massively in popularity. Fantasy football in particular is the most popular fantasy sport in the United States. People spend hours upon hours every year building, researching, and perfecting their teams to compete with others for money or bragging rights. One problem, however, is that National Football League (NFL) players are human and will not perform the same as they did last week or last season. Because of this, there is a need for a machine learning model to help predict when players will have a tough game or when they can perform above average. This report discusses the history and science of fantasy football, gathering large amounts of player data, manipulating the information to create more insightful data points, building a machine learning model, and using this tool in a real-world situation. The initial model produced accurate predictions for quarterbacks and running backs, but not for receivers and tight ends. Improvements reduced the mean absolute error to below one for all positions, significantly increasing accuracy and resulting in a successful model for all four positions.
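The report's headline metric is a mean absolute error (MAE, which it calls mean average error) below one fantasy point per position. The sketch below shows how that metric is computed; the projected and actual point values are made up for illustration.

```python
import numpy as np

# Hypothetical projected vs. actual fantasy points for one week (illustrative data).
projected = np.array([18.4, 12.1, 9.8, 14.6])   # model projections
actual    = np.array([17.9, 13.1, 8.3, 14.8])   # real scoring outcomes

# Mean absolute error: the average magnitude of the projection miss.
mae = np.mean(np.abs(projected - actual))
print(f"MAE = {mae:.2f} fantasy points")  # MAE = 0.80
```

An MAE below one means the model's weekly projections miss the true score by less than one fantasy point on average, which is the success criterion the report applies to all four positions.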

Contributors: Case, Spencer (Author) / Johnson, Jarod (Co-author) / Kostelich, Eric (Thesis director) / Zhuang, Houlong (Committee member) / Barrett, The Honors College (Contributor) / Department of Psychology (Contributor) / Mechanical and Aerospace Engineering Program (Contributor)
Created: 2023-05
Description
A Compact Linear Fresnel Reflector (CLFR) is a simple, cost-effective, and scalable option for generating solar power by concentrating the sun's rays. For the most feasible application, the design parameters of the CLFR, such as the solar concentrator parameters, receiver parameters, heat transfer, and power block parameters, should be optimized to achieve optimum efficiency. Many researchers have modeled and optimized CLFRs with various numerical or analytical methods; however, the computational time and cost of these existing approaches are often significant. This research addresses that issue by proposing a novel computational approach that combines increased computational efficiency with machine learning. The approach consists of two parts: an algorithm and a machine learning model. The algorithm implements a simplified version of the conventional Monte Carlo ray-tracing method for CLFR collector simulation. For various configurations of the CLFR system, optical losses and optical efficiency are calculated from design parameters such as the number of mirrors, mirror length, mirror width, spacing between adjacent mirrors, and orientation angle of the CLFR system. To further reduce computational time, a machine learning model is used to predict the optical efficiency of these configurations. The entire method is validated against an existing tool (SolTrace) for the optical losses and optical efficiency of a CLFR system. The ray-tracing program requires 6.63 CPU-hours to calculate the efficiency, whereas the machine learning model predicts the optical efficiency in seconds with high accuracy. This method can therefore be used to optimize a CLFR system for a given location and land configuration with greatly reduced computational time, making the CLFR a more viable candidate for concentrating solar power.
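A minimal Monte Carlo sketch of the intercept-efficiency idea behind such a ray tracer is shown below: rays reflected from a mirror with Gaussian slope errors either land within the receiver aperture or are lost. The geometry and error magnitudes are illustrative assumptions, not the thesis's actual CLFR configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative geometry (not the thesis's configuration): one mirror aiming at a
# receiver a distance D away; mirror slope errors follow a Gaussian distribution.
D = 10.0                 # mirror-to-receiver distance, m
half_width = 0.15        # receiver aperture half-width, m
sigma_slope = 0.005      # mirror slope error std. dev., rad

n_rays = 100_000
slope_err = rng.normal(0.0, sigma_slope, n_rays)

# A slope error of eps deflects the reflected ray by 2*eps, so each ray lands
# at a lateral offset of D * tan(2*eps) from the receiver centerline.
offset = D * np.tan(2.0 * slope_err)

# Intercept efficiency: fraction of rays landing within the receiver aperture.
efficiency = np.mean(np.abs(offset) <= half_width)
print(f"optical intercept efficiency = {efficiency:.3f}")
```

A full CLFR ray tracer adds shading, blocking, cosine, and end losses over many mirrors; the machine learning surrogate then replaces repeated runs of this sampling loop across configurations.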
Contributors: Lunagariya, Shyam (Author) / Phelan, Patrick (Thesis advisor) / Kwon, Beomjin (Committee member) / Zhuang, Houlong (Committee member) / Arizona State University (Publisher)
Created: 2021
Description
Phase-field (PF) models are among the most powerful tools for simulating microstructural evolution in metallic materials, polymers, and ceramics. However, existing PF approaches rely on rigorous mathematical model development, sophisticated numerical schemes, and high-performance computing for accuracy. Although recently developed surrogate microstructure models employ deep-learning techniques and reconstruct microstructures from lower-dimensional data, their accuracy is fairly limited because spatio-temporal information is lost in the pursuit of dimensional reduction. Given these limitations, a novel data-driven emulator (DDE) for extrapolative prediction of microstructural evolution is presented, which combines an image-based convolutional and recurrent neural network (CRNN) with tensor decomposition, while leveraging previously obtained PF datasets for training. To assess the robustness of the DDE, the emulation sequence and the scaling behavior are compared with phase-field simulations for several noisy initial states. The effectiveness of the microstructure emulation technique is then explored in the context of accelerating runtime, with an emphasis on its trade-off with accuracy.

An interpolative DDE has also been tested, which obtains a low-dimensional representation of the microstructures via tensor decomposition and subsequently predicts the microstructure evolution in the low-dimensional space using Gaussian process regression (GPR). Once the microstructure predictions are obtained in the low-dimensional space, a hybrid input-output phase-retrieval algorithm is employed to reconstruct the microstructures. As proof of concept, results on microstructure prediction for spinodal decomposition are presented, although the method itself is agnostic of the material parameters.
Results show that the GPR-based DDE model is able to predict microstructure evolution sequences that closely resemble the true microstructures (average normalized mean-squared error of 6.78 × 10⁻⁷) at time scales half of that employed in obtaining the training data. This data-driven microstructure emulator opens new avenues for predicting microstructural evolution by leveraging phase-field simulations and physical experiments in which the time resolution is often quite large due to limited resources and physical constraints, such as the phase-coarsening experiments previously performed in microgravity. Future work will also discuss and demonstrate the intended combined application of these two approaches for 3D microstructure prediction.
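The GPR step of the interpolative DDE can be sketched with a minimal hand-rolled Gaussian process posterior mean: a smooth latent coefficient is observed at a few training times and predicted at intermediate times. The synthetic sine trajectory, kernel, and lengthscale below are illustrative assumptions standing in for a tensor-decomposition coefficient from real PF data.

```python
import numpy as np

def rbf(a, b, lengthscale=1.0):
    """Squared-exponential (RBF) kernel matrix between two 1-D input arrays."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / lengthscale ** 2)

# Synthetic stand-in for one low-dimensional tensor-decomposition coefficient
# evolving over time (the real latent trajectories come from PF simulations).
t_train = np.linspace(0.0, 3.0, 10)
y_train = np.sin(t_train)

# GP posterior mean at new times, with a small jitter for numerical stability.
t_test = np.array([1.5, 2.25])
K = rbf(t_train, t_train) + 1e-8 * np.eye(t_train.size)
alpha = np.linalg.solve(K, y_train)
y_pred = rbf(t_test, t_train) @ alpha

print(np.abs(y_pred - np.sin(t_test)))  # small interpolation errors
```

In the full pipeline each latent coefficient gets such a regression, and the predicted low-dimensional states are then passed to the phase-retrieval step to reconstruct full microstructure images.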
Contributors: Wu, Peichen (Author) / Ankit, Kumar (Thesis advisor) / Iquebal, Ashif (Committee member) / Jiao, Yang (Committee member) / Zhuang, Houlong (Committee member) / Arizona State University (Publisher)
Created: 2024