Matching Items (695)

Description

Automated vehicles are becoming more prevalent in the modern world. Using platoons of automated vehicles can have numerous benefits, including increasing the safety of drivers as well as streamlining roadway operations. How individual automated vehicles within a platoon react to each other is essential to creating an efficient method of travel. This paper looks at two individual vehicles forming a platoon and tracks the time headway between them. Several speed profiles are explored for the following vehicle, including triangular and trapezoidal speed profiles. A safety violation is discovered during platoon formation, in which the desired time headway between the vehicles is not maintained. The aim of this research is to explore whether this violation can be eliminated or reduced through the use of different speed profiles.
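The headway-tracking idea can be sketched numerically. The following is a minimal illustration, not the paper's model: a follower closes a gap on a constant-speed leader using a trapezoidal speed profile, and the simulation records the smallest time headway seen. All parameter values are assumptions.

```python
# Minimal sketch (not the paper's model): a follower closes on a constant-speed
# leader with a trapezoidal speed profile; time headway = spacing / speed.
# All parameter values below are illustrative assumptions.

def trapezoidal_speed(t, v_base, v_peak, t_accel, t_cruise):
    """Follower speed: ramp up to v_peak, hold, then ramp back down to v_base."""
    if t < t_accel:
        return v_base + (v_peak - v_base) * t / t_accel
    if t < t_accel + t_cruise:
        return v_peak
    t_down = t - t_accel - t_cruise
    return v_peak - (v_peak - v_base) * min(t_down, t_accel) / t_accel

def min_time_headway(gap0=120.0, v_lead=25.0, v_peak=32.0,
                     t_accel=5.0, t_cruise=10.0, dt=0.01):
    """Integrate the spacing and return the smallest time headway (s) observed."""
    gap, t, h_min = gap0, 0.0, float("inf")
    while t < 2 * t_accel + t_cruise:
        v_f = trapezoidal_speed(t, v_lead, v_peak, t_accel, t_cruise)
        gap += (v_lead - v_f) * dt       # follower is faster, so the gap shrinks
        h_min = min(h_min, gap / v_f)    # time headway = spacing / follower speed
        t += dt
    return h_min
```

With these numbers the minimum headway dips below 1 s even though no collision occurs, mirroring the kind of transient violation the abstract describes.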

Contributors: Larson, Kurt Gregory (Author) / Lou, Yingyan (Thesis director) / Chen, Yan (Committee member) / Civil, Environmental and Sustainable Eng Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2021-05
Description

High-entropy alloys possessing mechanical, chemical, and electrical properties that far exceed those of conventional alloys have the potential to make a significant impact on many areas of engineering. Identifying element combinations and configurations to form these alloys, however, is a difficult, time-consuming, computationally intensive task. Machine learning has revolutionized many different fields due to its ability to generalize well to different problems and produce computationally efficient, accurate predictions regarding the system of interest. In this thesis, we demonstrate the effectiveness of machine learning models applied to toy cases representative of simplified physics that are relevant to high-entropy alloy simulation. We show these models are effective at learning nonlinear dynamics for single and multi-particle cases and that more work is needed to accurately represent complex cases in which the system dynamics are chaotic. This thesis serves as a demonstration of the potential benefits of machine learning applied to high-entropy alloy simulations to generate fast, accurate predictions of nonlinear dynamics.
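As a toy illustration of the kind of task described (an assumption-laden stand-in, not the thesis's models or code), a linear least-squares model with polynomial features can learn the one-step dynamics of a single particle in a nonlinear potential exactly when the features span the true right-hand side:

```python
import numpy as np

# Illustrative sketch: learn the one-step dynamics of a single particle in a
# nonlinear (Duffing-like) potential from simulated data via least squares on
# polynomial features. Not the thesis's models; all choices are assumptions.

def step(x, v, dt=0.01):
    """Ground-truth dynamics: x'' = -x - x**3 (undamped Duffing oscillator)."""
    a = -x - x**3
    return x + v * dt, v + a * dt

def make_data(n=2000, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.uniform(-1, 1, n)
    v = rng.uniform(-1, 1, n)
    xn, vn = step(x, v)
    feats = np.column_stack([x, v, x**3])   # features chosen to span the truth
    return feats, np.column_stack([xn, vn])

def fit(feats, targets):
    """Linear least-squares model mapping features -> next state."""
    w, *_ = np.linalg.lstsq(feats, targets, rcond=None)
    return w
```

Because x + v·dt and v + dt·(−x − x³) are linear in the features [x, v, x³], the fit is essentially exact here; chaotic multi-particle cases, as the abstract notes, are much harder.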

Contributors: Daly, John H (Author) / Ren, Yi (Thesis director) / Zhuang, Houlong (Committee member) / Mechanical and Aerospace Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2021-05
Description

Self-efficacy in engineering, engineering identity, and coping in engineering have been shown in previous studies to be highly important to one's development in the field of engineering. Through the creation and deployment of a 17-question survey, undergraduate and first-year master's students were asked to provide information on their engagement at their university and their demographic information, and to rank their level of agreement with 22 statements relating to the aforementioned ideas. Using the collected data, exploratory factor analysis was completed to identify the underlying factors and any correlations. No statistically significant correlations were found between the three identified factors and demographic or engagement information. A significantly larger sample size is needed for statistically significant results to emerge. Additionally, future work is needed on the creation of an engagement measure that successfully reflects the level and impact of participation in engineering activities beyond traditional coursework.
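The extraction step of exploratory factor analysis can be illustrated on synthetic data (a generic sketch, not the study's analysis or its actual survey data): count the factors whose correlation-matrix eigenvalues exceed 1, the Kaiser criterion.

```python
import numpy as np

# A generic sketch of the extraction step behind exploratory factor analysis
# (not the study's analysis; the data below are synthetic): retain factors
# whose correlation-matrix eigenvalues exceed 1, the Kaiser criterion.

def kaiser_factor_count(responses):
    """responses: (n_respondents, n_items) matrix of survey answers."""
    corr = np.corrcoef(responses, rowvar=False)
    eigvals = np.linalg.eigvalsh(corr)
    return int(np.sum(eigvals > 1.0))

# Synthetic survey: 300 respondents, 22 statements driven by 3 latent factors.
rng = np.random.default_rng(1)
latent = rng.normal(size=(300, 3))
loadings = np.zeros((3, 22))
loadings[0, :8] = 0.8       # statements 1-8 load on factor 1
loadings[1, 8:15] = 0.8     # statements 9-15 load on factor 2
loadings[2, 15:] = 0.8      # statements 16-22 load on factor 3
data = latent @ loadings + 0.3 * rng.normal(size=(300, 22))
```

On this clean synthetic design the criterion recovers three factors; real survey data additionally calls for rotation, loading inspection, and reliability checks.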

Contributors: Jones, Elizabeth Michelle (Author) / Ganesh, Tirupalavanam (Thesis director) / Graham, Kaely (Committee member) / Electrical Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2021-05
Description

The colossal global counterfeit market and advances in cryptography including quantum computing supremacy have led the drive for a class of anti-counterfeit tags that are physically unclonable. Dendrites, previously considered an undesirable side effect of battery operation, have promise as an extremely versatile version of such tags, with their fundamental nature ensuring that no two dendrites are alike and that they can be read at multiple magnification scales. In this work, we first pursue a simulation for electrochemical dendrites that elucidates fundamental information about their growth mechanism. We then translate these results into physical dendrites and demonstrate methods of producing a hash from these dendrites that is damage-tolerant for real-world verification. Finally, we explore theoretical curiosities that arise from the fractal nature of dendrites. We find that uniquely ramified dendrites, which rely on lower ion mobility and conductive deposition, are particularly amenable to wavelet hashing, and demonstrate that these dendrites have strong commercial potential for securing supply chains at the highest level while maintaining a low price point.
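The damage-tolerant hashing idea can be illustrated with a simple Haar-style wavelet hash (a generic sketch, not the authors' pipeline; the "tag" below is just a random image standing in for a dendrite):

```python
import numpy as np

# A generic sketch of damage-tolerant wavelet hashing (not the authors' exact
# pipeline): keep only the coarse Haar approximation of the image, threshold
# at the median to get a bit string, and compare tags by Hamming distance so
# that local damage flips only a few bits instead of changing the whole hash.

def haar_approx(img, levels=3):
    """Repeated 2x2 averaging, i.e. the Haar approximation coefficients."""
    a = np.asarray(img, dtype=float)
    for _ in range(levels):
        a = (a[0::2, 0::2] + a[1::2, 0::2] + a[0::2, 1::2] + a[1::2, 1::2]) / 4
    return a

def wavelet_hash(img, levels=3):
    a = haar_approx(img, levels)
    return (a > np.median(a)).ravel()

def hamming(h1, h2):
    return int(np.sum(h1 != h2))

rng = np.random.default_rng(0)
tag = rng.random((64, 64))      # enrolled "dendrite" tag (random stand-in)
damaged = tag.copy()
damaged[:4, :4] = 0.0           # small local damage, e.g. a scratch
other = rng.random((64, 64))    # an unrelated tag
```

The damaged tag stays within a small Hamming radius of the original while an unrelated tag lands near half the bits away, which is what makes threshold-based verification workable.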

Contributors: Sneh, Tal (Author) / Kozicki, Michael (Thesis director) / Gonzalez-Velo, Yago (Committee member) / School of Molecular Sciences (Contributor) / Department of Physics (Contributor) / Barrett, The Honors College (Contributor)
Created: 2021-05
Description

Time studies are an effective tool to analyze current production systems and propose improvements. The problem that motivated the project was that conducting time studies and observing the progression of components across the factory floor is a manual process. Four Industrial Engineering students worked with a manufacturing company to develop Computer Vision technology that would automate the data collection process for time studies. The team worked in an Agile environment to complete over 120 classification sets, create 8 strategy documents, and utilize Root Cause Analysis techniques to audit and validate the performance of the trained Computer Vision data models. In the future, there is an opportunity to continue developing this product and to expand the team's scope of work by applying more engineering skills to the collected data to drive factory improvements.

Contributors: Johnson, Katelyn Rose (Co-author) / Martz, Emma (Co-author) / Chmelnik, Nathan (Co-author) / de Guzman, Lorenzo (Co-author) / Ju, Feng (Thesis director) / Courter, Brandon (Committee member) / Industrial, Systems & Operations Engineering Prgm (Contributor) / Barrett, The Honors College (Contributor)
Created: 2021-05
Description

Time studies are an effective tool to analyze current production systems and propose improvements. The problem that motivated the project was that conducting time studies and observing the progression of components across the factory floor is a manual process. Four Industrial Engineering students worked with a manufacturing company to develop Computer Vision technology that would automate the data collection process for time studies. The team worked in an Agile environment to complete over 120 classification sets, create 8 strategy documents, and utilize Root Cause Analysis techniques to audit and validate the performance of the trained Computer Vision data models. In the future, there is an opportunity to continue developing this product and to expand the team's scope of work by applying more engineering skills to the collected data to drive factory improvements.

Contributors: Chmelnik, Nathan (Co-author) / de Guzman, Lorenzo (Co-author) / Johnson, Katelyn (Co-author) / Martz, Emma (Co-author) / Ju, Feng (Thesis director) / Courter, Brandon (Committee member) / Industrial, Systems & Operations Engineering Prgm (Contributor) / Barrett, The Honors College (Contributor)
Created: 2021-05
Description
Following the success of incorporating perceptual models in audio coding algorithms, their application in other speech/audio processing systems is expanding. In general, all perceptual speech/audio processing algorithms involve minimization of an objective function that directly or indirectly incorporates properties of human perception. This dissertation primarily investigates the problems associated with directly embedding an auditory model in the objective function formulation and proposes possible solutions to overcome the high-complexity issues that arise in real-time speech/audio algorithms. Specific problems addressed in this dissertation include: 1) the development of approximate but computationally efficient auditory model implementations that are consistent with the principles of psychoacoustics, and 2) the development of a mapping scheme that allows synthesizing a time/frequency-domain representation from its equivalent auditory model output. The first problem addresses the high computational complexity involved in solving perceptual objective functions that require repeated application of the auditory model to evaluate different candidate solutions. In this dissertation, frequency-pruning and detector-pruning algorithms are developed that efficiently implement the various auditory model stages. The performance of the pruned model is compared to that of the original auditory model for different types of test signals in the SQAM database. Experimental results indicate only a 4-7% relative error in loudness while attaining up to 80-90% reduction in computational complexity. Similarly, a hybrid algorithm is developed specifically for use with sinusoidal signals; it employs the proposed auditory-pattern-combining technique together with a look-up table that stores representative auditory patterns.
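The frequency-pruning idea can be illustrated with a toy loudness model (an illustration only, not the dissertation's auditory model; the compressive rule N' = E**0.23 and the 30 dB threshold are assumptions): bands whose excitation falls far below the peak are skipped, trading a small loudness error for far fewer band evaluations.

```python
import numpy as np

# Toy sketch of frequency pruning (not the dissertation's auditory model):
# skip bands whose excitation is more than prune_db below the peak, then
# compare total loudness and work done against the full evaluation.

def loudness(excitation, prune_db=None):
    """Total loudness = sum of per-band specific loudness E**0.23 (a toy rule).
    With prune_db set, bands more than prune_db below the peak are skipped."""
    keep = np.ones(excitation.size, dtype=bool)
    if prune_db is not None:
        keep = excitation >= excitation.max() * 10.0 ** (-prune_db / 10.0)
    return float(np.sum(excitation[keep] ** 0.23)), int(keep.sum())

# Sparse excitation pattern: three strong tonal bands over a weak noise floor.
rng = np.random.default_rng(2)
excitation = 1e-12 * rng.random(64)
excitation[[5, 12, 30]] = [1.0, 0.5, 0.25]

full, n_full = loudness(excitation)
pruned, n_pruned = loudness(excitation, prune_db=30)
rel_err = abs(full - pruned) / full
```

Here pruning evaluates 3 of 64 bands at a few percent relative loudness error, the same kind of complexity/accuracy trade the experiments above quantify on real signals.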
The second problem involves obtaining an estimate of the auditory representation that minimizes a perceptual objective function and transforming the auditory pattern back to its equivalent time/frequency representation. This avoids the repeated application of the auditory model stages to test different candidate time/frequency vectors when minimizing perceptual objective functions. In this dissertation, a constrained mapping scheme is developed by linearizing certain auditory model stages, ensuring that a time/frequency mapping corresponding to the estimated auditory representation is obtained. This paradigm was successfully incorporated in a perceptual speech enhancement algorithm and a sinusoidal component selection task.
Contributors: Krishnamoorthi, Harish (Author) / Spanias, Andreas (Thesis advisor) / Papandreou-Suppappola, Antonia (Committee member) / Tepedelenlioğlu, Cihan (Committee member) / Tsakalis, Konstantinos (Committee member) / Arizona State University (Publisher)
Created: 2011
Description
Concern regarding the quality of traffic data exists among engineers and planners tasked with obtaining and using the data for various transportation applications. While data quality issues are often understood by analysts doing the hands-on work, the quality characteristics of the data are rarely communicated effectively beyond the analyst. This research is an exercise in measuring and reporting data quality. The assessment was conducted to support the performance measurement program at the Maricopa Association of Governments in Phoenix, Arizona, and investigates the traffic data from 228 continuous monitoring freeway sensors in the metropolitan region. Results of the assessment provide an example of describing the quality of the traffic data with each of six data quality measures suggested in the literature: accuracy, completeness, validity, timeliness, coverage, and accessibility. An important contribution is made in the use of data quality visualization tools. These tools are used to evaluate the validity of the traffic data beyond the pass/fail criteria commonly used. More significantly, they serve to build an intuitive understanding of the underlying characteristics of the data considered valid. Recommendations from the experience gained in this assessment include that data quality visualization tools be developed and used in the processing and quality control of traffic data, and that these tools, along with other information on the quality control effort, be stored as metadata with the processed data.
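Two of the six measures named above, completeness and validity, reduce to simple ratios; the sketch below applies them to a synthetic sensor-day of 5-minute volume counts (the thresholds are illustrative assumptions, not MAG's actual criteria):

```python
# Illustrative only: completeness and validity ratios for one sensor-day of
# 5-minute volume counts. Thresholds are assumptions, not MAG's criteria.

def completeness(records, expected=288):
    """Fraction of the expected 5-minute intervals actually reported."""
    return len(records) / expected

def validity(records, max_volume=900):
    """Fraction of reported counts passing a simple plausibility check."""
    ok = [r for r in records if 0 <= r <= max_volume]
    return len(ok) / len(records)

# 278 of 288 intervals reported; two of them carry implausible values.
day = [120] * 270 + [-1, 5000] + [80] * 6
```

Reporting the two numbers separately matters: a sensor can be nearly complete yet report implausible values, and vice versa; the visualization tools discussed above help communicate the patterns behind such failures.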
Contributors: Samuelson, Jothan P (Author) / Pendyala, Ram M. (Thesis advisor) / Ahn, Soyoung (Committee member) / Arizona State University (Publisher)
Created: 2011
Description
In the search for chemical biosensors designed for patient-based physiological applications, non-invasive diagnostic approaches continue to have value. The work described in this thesis builds upon previous breath analysis studies. In particular, it seeks to assess the adsorptive mechanisms active in both acetone and ethanol biosensors designed for breath analysis. The thermoelectric biosensors under investigation were constructed using a thermopile for transduction and four different materials for biorecognition. The analytes, acetone and ethanol, were evaluated under dry-air and humidified-air conditions. The biosensor response to acetone concentration was found to be both repeatable and linear, while the sensor response to ethanol presence was also found to be repeatable. The different biorecognition materials produced discernible thermoelectric responses that were characteristic for each analyte. The sensor output data is presented in this report. Additionally, the results were evaluated against a mathematical model for further analysis. Ultimately, a thermoelectric biosensor based upon adsorption chemistry was developed and characterized. Additional work is needed to characterize the physicochemical action mechanism.
Contributors: Wilson, Kimberly (Author) / Guilbeau, Eric (Thesis advisor) / Pizziconi, Vincent (Thesis advisor) / LaBelle, Jeffrey (Committee member) / Arizona State University (Publisher)
Created: 2011
Description
The video game graphics pipeline has traditionally rendered the scene using a polygonal approach. Advances in modern graphics hardware now allow the rendering of parametric methods. This thesis explores various smooth-surface rendering methods that can be integrated into the video game graphics engine. Moving from the polygonal domain to parametric or smooth surfaces has its share of issues, and there is an inherent need to address the various rendering bottlenecks that could hamper such a move. The game engine needs to choose an appropriate method based on in-game characteristics of the objects; character and animated objects need more sophisticated methods, whereas static objects can use simpler techniques. Scaling the polygon count across hardware platforms becomes an important factor. Fine control is needed over the tessellation levels, whether imposed by hardware limitations or by the application, to render the mesh adaptively without significant loss in performance. This thesis explores several methods that help game engine developers make correct design choices by optimally balancing these trade-offs while rendering the scene using smooth surfaces. It proposes a novel technique for adaptive tessellation of triangular meshes that vastly improves speed and tessellation count. It develops an approximate method for rendering Loop subdivision surfaces on tessellation-enabled hardware. A taxonomy and evaluation of the methods is provided, and a unified rendering system that provides automatic level of detail by switching between the methods is proposed.
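The adaptive-tessellation idea can be sketched in its simplest 2D form (an illustration only, far simpler than the thesis's method): bisect a triangle's longest edge until every edge is within a tolerance, so loose tolerances yield few triangles and tight ones yield many.

```python
import math

# Toy 2D sketch of adaptive tessellation (much simpler than the thesis's
# method): split each triangle at the midpoint of its longest edge until
# every edge is shorter than a tolerance.

def longest_edge(tri):
    """Return (p, q, r) with (p, q) the longest edge and r the opposite vertex."""
    a, b, c = tri
    candidates = [(a, b, c), (b, c, a), (c, a, b)]
    return max(candidates, key=lambda e: math.dist(e[0], e[1]))

def tessellate(tri, tol):
    """Longest-edge bisection until all edges are within tol."""
    p, q, r = longest_edge(tri)
    if math.dist(p, q) <= tol:
        return [tri]
    m = ((p[0] + q[0]) / 2, (p[1] + q[1]) / 2)
    return tessellate((p, m, r), tol) + tessellate((m, q, r), tol)

tri = ((0.0, 0.0), (4.0, 0.0), (0.0, 3.0))
coarse = tessellate(tri, tol=4.0)   # loose tolerance: few triangles
fine = tessellate(tri, tol=1.0)     # tight tolerance: many triangles
```

Real engines make the split decision in screen space and must also avoid T-junction cracks between neighboring patches, which is where the hardware tessellation methods surveyed in the thesis come in.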
Contributors: Amresh, Ashish (Author) / Farin, Gerald (Thesis advisor) / Razdan, Anshuman (Thesis advisor) / Wonka, Peter (Committee member) / Hansford, Dianne (Committee member) / Arizona State University (Publisher)
Created: 2011