This collection includes both ASU theses and dissertations, submitted by graduate students, and Barrett, The Honors College theses, submitted by undergraduate students.

Displaying 1 - 10 of 347

Description
Reliable extraction of human pose features that are invariant to view angle and body shape changes is critical for advancing human movement analysis. In this dissertation, multifactor analysis techniques, including multilinear analysis and multifactor Gaussian process methods, have been exploited to extract such invariant pose features from video data by decomposing key contributing factors, such as pose, view angle, and body shape, in the generation of the image observations. Experimental results have shown that the pose features extracted using the proposed methods exhibit excellent invariance to changes in view angle and body shape. Furthermore, using the proposed invariant multifactor pose features, a suite of simple yet effective algorithms has been developed to solve the movement recognition and pose estimation problems. Using these algorithms, excellent human movement analysis results have been obtained, most of them superior to those from state-of-the-art algorithms on the same testing datasets. Moreover, a number of key movement analysis challenges, including robust online gesture spotting and multi-camera gesture recognition, have also been addressed in this research. To this end, an online gesture spotting framework has been developed that automatically detects and learns non-gesture movement patterns to improve gesture localization and recognition from continuous data streams using a hidden Markov network. In addition, the optimal data fusion scheme for multi-camera gesture recognition has been investigated, and decision-level camera fusion using the product rule has been found to be optimal for gesture recognition with multiple uncalibrated cameras. Furthermore, the challenge of optimal camera selection in multi-camera gesture recognition has also been tackled, and a measure quantifying the complementary strength across cameras has been proposed.
Experimental results obtained from a real-life gesture recognition dataset have shown that the optimal camera combinations identified according to the proposed complementary measure always lead to the best gesture recognition results.
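The decision-level product-rule fusion described in this abstract can be sketched in a few lines. The per-camera class posteriors below are made-up illustrative values, not data from the dissertation: each camera's posteriors are multiplied element-wise across cameras, renormalized, and the highest-scoring gesture class is selected.

```python
import numpy as np

# Hypothetical per-camera posteriors for one gesture observation
# (rows: cameras, columns: gesture classes); values are illustrative.
posteriors = np.array([
    [0.6, 0.3, 0.1],   # camera 1
    [0.5, 0.4, 0.1],   # camera 2
    [0.7, 0.2, 0.1],   # camera 3
])

# Decision-level fusion by the product rule: multiply the posteriors
# across cameras, renormalize, and pick the highest-scoring class.
fused = np.prod(posteriors, axis=0)
fused /= fused.sum()
predicted_class = int(np.argmax(fused))
```

A practical appeal of this rule in the setting the abstract describes is that it needs no camera calibration: each camera's classifier contributes only through its posterior.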
Contributors: Peng, Bo (Author) / Qian, Gang (Thesis advisor) / Ye, Jieping (Committee member) / Li, Baoxin (Committee member) / Spanias, Andreas (Committee member) / Arizona State University (Publisher)
Created: 2011
Description
Many products undergo several stages of testing, ranging from tests on individual components to end-item tests. Additionally, these products may be further "tested" via customer or field use. The later failure of a delivered product may in some cases be due to circumstances that have no correlation with the product's inherent quality. At times, however, there may be cues in the upstream test data that, if detected, could predict the likelihood of downstream failure or performance degradation induced by product use or environmental stresses. This study explores the use of downstream factory test data or product field reliability data to derive data mining and pattern recognition criteria for manufacturing process or upstream test data by means of support vector machines (SVMs), in order to build reliability prediction models. In concert with a risk/benefit analysis, these models can be used to drive improvement of the product or, at least, to improve through screening the reliability of the product delivered to the customer. Such models can also aid reliability risk assessment based on detectable correlations between product test performance and the sources of supply, test stands, or other factors related to product manufacture. To enhance the usefulness of the SVM (hyperplane) classifier in this context, L-moments and the Western Electric Company (WECO) rules are used to augment or replace the native process or test data used as inputs to the classifier. As part of this research, a generalizable binary classification methodology was developed that can be used to design and implement predictors of end-item field failure or downstream product performance from upstream test data composed of single-parameter, time-series, or multivariate real-valued data.
Additionally, the methodology provides input parameter weighting factors that have proved useful in failure analysis and root cause investigations as indicators of which of several upstream product parameters have the greater influence on the downstream failure outcomes.
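As a rough illustration of the L-moment features mentioned above, the sketch below computes the first two sample L-moments (location and scale) from probability-weighted moments of the sorted sample; the function name and example data are ours, not the study's.

```python
def sample_l_moments(data):
    """First two sample L-moments of a data vector: l1 (location) and l2
    (scale), computed from the probability-weighted moments b0 and b1 of
    the sorted sample. L-moments are more outlier-robust than ordinary
    moments, which is part of what makes them useful classifier inputs."""
    x = sorted(data)
    n = len(x)
    b0 = sum(x) / n                                        # plain mean
    b1 = sum(i * xi for i, xi in enumerate(x)) / (n * (n - 1))
    return b0, 2 * b1 - b0

# Hypothetical upstream test readings for one unit (illustrative values):
l1, l2 = sample_l_moments([4.1, 3.9, 4.0, 4.2, 9.5])
```

Summaries like (l1, l2) per test channel could then stand in for, or augment, the raw measurements fed to the hyperplane classifier.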
Contributors: Mosley, James (Author) / Morrell, Darryl (Committee member) / Cochran, Douglas (Committee member) / Papandreou-Suppappola, Antonia (Committee member) / Roberts, Chell (Committee member) / Spanias, Andreas (Committee member) / Arizona State University (Publisher)
Created: 2011
Description
The demand for handheld portable computing in education, business, and research has resulted in advanced mobile devices with powerful processors and large multi-touch screens. Such devices are capable of handling tasks of moderate computational complexity, such as word processing, complex Internet transactions, and even human motion analysis. Apple's iOS devices, including the iPhone, iPod touch, and the latest in the family, the iPad, are among the best-known and most widely used mobile devices today. Their advanced multi-touch interface and improved processing power can be exploited for engineering and STEM demonstrations. Moreover, these devices have become a part of everyday student life. Hence, the design of exciting mobile applications and software represents a great opportunity to build student interest and enthusiasm in science and engineering. This thesis presents the design and implementation of portable interactive signal processing simulation software on the iOS platform. The iOS-based object-oriented application, called i-JDSP, is based on the award-winning Java-DSP concept. It is implemented in Objective-C and C as a native Cocoa Touch application that can run on any iOS device. i-JDSP offers basic signal processing simulation functions, such as the Fast Fourier Transform (FFT), filtering, and spectral analysis, on a compact and convenient graphical user interface, and provides a very compelling multi-touch programming experience. Built-in modules also demonstrate concepts such as pole-zero placement. i-JDSP also incorporates sound capture and playback options that can be used in near-real-time analysis of speech and audio signals. All simulations can be set up visually by forming interactive block diagrams through multi-touch and drag-and-drop. Computations are performed on the mobile device when necessary, making block diagram execution fast. Furthermore, the extensive support for user interactivity provides scope for improved learning.
The results of i-JDSP assessment among senior undergraduate and first-year graduate students revealed that the software created a significant positive impact, increasing the students' interest and motivation in understanding basic DSP concepts.
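To give a flavor of the kind of spectral-analysis exercise described above, the sketch below (in Python rather than the app's Objective-C, with signal parameters assumed for illustration) samples a 1 kHz sinusoid at 8 kHz and locates its FFT peak.

```python
import numpy as np

# Sample a 1 kHz sinusoid at fs = 8 kHz and take a 256-point FFT.
# With bin spacing fs/n = 31.25 Hz, 1000 Hz falls exactly on bin 32.
fs = 8000.0
n = 256
t = np.arange(n) / fs
x = np.sin(2 * np.pi * 1000.0 * t)

spectrum = np.abs(np.fft.rfft(x))   # magnitude spectrum, bins 0..n/2
peak_bin = int(np.argmax(spectrum))
peak_freq = peak_bin * fs / n       # convert bin index back to hertz
```

In the app, the equivalent simulation is assembled by dragging signal-generator and FFT blocks into a diagram rather than writing code.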
Contributors: Liu, Jinru (Author) / Spanias, Andreas (Thesis advisor) / Tsakalis, Kostas (Committee member) / Qian, Gang (Committee member) / Arizona State University (Publisher)
Created: 2011
Description
Great advances have been made in the construction of photovoltaic (PV) cells and modules, but array-level management remains much the same as it was in previous decades. Conventionally, the PV array is connected in a fixed topology, which is not always appropriate in the presence of array faults and varying weather conditions. With the introduction of smarter inverters and solar modules, the data obtained from the photovoltaic array can be used to dynamically modify the array topology and improve the array power output. This is beneficial especially when module mismatches such as shading, soiling, and aging occur in the photovoltaic array. This research focuses on the topology optimization of PV arrays under shading conditions using measurements obtained from a PV array set-up. A scheme known as the topology reconfiguration method is proposed to find the optimal array topology for a given weather condition and given faulty-module information. Topologies considered include the series-parallel (SP), the total cross-tied (TCT), the bridge-link (BL), and their bypassed versions. The topology reconfiguration method compares the efficiencies of the topologies, evaluates the percentage gain in generated power that would be obtained by reconfiguring the array, and weighs other factors to find the optimal topology. This method is employed for various possible shading patterns to predict the best topology. The results demonstrate the benefit of an electrically reconfigurable array topology. The effects of irradiance and shading on array performance are also studied. The simulations are carried out using a SPICE simulator, and the simulation results are validated against experimental data provided by the PACECO Company.
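To illustrate why topology matters under partial shading, the following crude model (a didactic approximation with assumed module parameters, not the SPICE model used in this research) compares series-parallel and total cross-tied output for a small shaded array: each module contributes current in proportion to its irradiance at a fixed operating voltage, a series string is limited by its most-shaded module, and cross-ties let modules in a row share current.

```python
# Per-module irradiance in suns; the middle module of row 2 is shaded.
irradiance = [
    [1.0, 1.0, 1.0],   # row 1
    [1.0, 0.4, 1.0],   # row 2
]

v_module = 30.0   # assumed module operating voltage (volts)
i_full = 8.0      # assumed module current at full sun (amps)

def series_parallel_power(rows):
    # SP: each column is a series string whose current is set by its
    # weakest (most shaded) module.
    n_rows, n_cols = len(rows), len(rows[0])
    power = 0.0
    for c in range(n_cols):
        string_current = min(i_full * rows[r][c] for r in range(n_rows))
        power += string_current * v_module * n_rows
    return power

def total_cross_tied_power(rows):
    # TCT: cross-ties let each row's modules share current, so each row
    # contributes in proportion to its total irradiance.
    return sum(i_full * sum(row) * v_module for row in rows)

p_sp = series_parallel_power(irradiance)
p_tct = total_cross_tied_power(irradiance)
```

Under this toy model the cross-tied wiring recovers the power that the shaded module would otherwise choke off in its series string, which is the effect the reconfiguration method exploits.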
Contributors: Buddha, Santoshi Tejasri (Author) / Spanias, Andreas (Thesis advisor) / Tepedelenlioğlu, Cihan (Thesis advisor) / Zhang, Junshan (Committee member) / Arizona State University (Publisher)
Created: 2011
Description
From exploring coffee plantations with an old Irishman in the mountains of Colombia to watching the sun set over the Strait of Gibraltar from the terrace of an ancient Moroccan cafe, this thesis sent Charles and Zane on an elaborate cafe-crawl across ten countries, with stops at a few of the world’s most interesting coffee houses. Some of these cafes, such as the world-renowned Caffè Florian (opened in 1720) and Caffè Greco (1760), are built on long-standing traditions. Others are led by innovators championing high-quality boutique shops and challenging mass-production chains such as Starbucks and Tim Hortons. These newer cafes fuel a movement known as the “Third Wave.” With a foundation gained from specialized courses with Patrick O’Malley, North America’s leading voice in coffee, Zane and Charles conducted first-hand research into the unique coffee preferences of multiple cultures, the emergence and impact of the Third Wave in these countries, and what the future may hold for coffee lovers.

Contributors: Ferguson, Charles William (Co-author) / Jarecke, Zane (Co-author) / Eaton, John (Thesis director) / Bonfiglio, Thomas (Committee member) / Dean, W.P. Carey School of Business (Contributor) / Department of Marketing (Contributor) / Barrett, The Honors College (Contributor)
Created: 2021-05
Description
From exploring coffee plantations with an old Irishman in the mountains of Colombia to watching the sun set over the Strait of Gibraltar from the terrace of an ancient Moroccan cafe, this thesis sent Charles and Zane on an elaborate cafe-crawl across ten countries, with stops at a few of the world’s most interesting coffee houses. Some of these cafes, such as the world-renowned Caffè Florian (opened in 1720) and Caffè Greco (1760), are built on long-standing traditions. Others are led by innovators championing high-quality boutique shops and challenging mass-production chains such as Starbucks and Tim Hortons. These newer cafes fuel a movement known as the “Third Wave.” With a foundation gained from specialized courses with Patrick O’Malley, North America’s leading voice in coffee, Zane and Charles conducted first-hand research into the unique coffee preferences of multiple cultures, the emergence and impact of the Third Wave in these countries, and what the future may hold for coffee lovers.

Contributors: Jarecke, Zane Micheal (Co-author) / Ferguson, Charles (Co-author) / Eaton, John (Thesis director) / Bonfiglio, Thomas (Committee member) / Dean, W.P. Carey School of Business (Contributor) / Department of Management and Entrepreneurship (Contributor) / Barrett, The Honors College (Contributor)
Created: 2021-05
Description
In the early years of the National Football League, scouting and roster development resembled the Wild West. Drafts were held in hotel ballrooms the day after the last game of the regular college football season. There was no combine, limited scouting, and no salary cap. Over time, these aspects have changed dramatically, in part due to key figures from Pete Rozelle to Gil Brandt to Bill Belichick. The developments and lessons of that period laid the foundational infrastructure on which modern roster construction is based. Today, managing a team and putting together a roster involves numerous people, intense scouting, layers of technology, and, critically, management of the salary cap. Since the cap was first put in place in 1994, managing it has become an essential element of building and sustaining a successful team. The New England Patriots’ mastery of the cap is a large part of what enabled their dynastic run over the past twenty years. While their model has undoubtedly proven successful, an opposing model has become increasingly popular and yielded results of its own. The two models center on different distributions of the salary cap, starting with the portion paid to the starting quarterback. The Patriots dynasty was, in part, made possible by their use of both models over the course of their dominance. Drafting, organizational culture, and coaching are all among the numerous critical factors in determining a team’s success, and it becomes difficult to pinpoint the true source of success for any given team. Ultimately, however, effective management of the cap proves to be a force multiplier; it does not guarantee that a team will be successful, but it helps teams that handle the other variables well sustain their success.

Contributors: Bolger, William (Author) / Eaton, John (Thesis director) / Mokwa, Michael (Committee member) / Department of Marketing (Contributor) / Sandra Day O'Connor College of Law (Contributor) / Barrett, The Honors College (Contributor)
Created: 2021-05
Description
Especially during the current COVID-19 pandemic and age of social unrest in the United States, there has been an increasing need for comfort, yet the idea of comfort is quite vague and rarely elaborated upon. To simplify the idea of comfort and communicate the ideas around it effectively, I define comfort as a subset of escapism in which a person escapes to reduce or alleviate feelings of grief or distress. As companies rush to comfort their customers in this state of uncertainty, marketers are pressed to identify people’s insecurities and comfort them without coming off as insensitive or trite. Current comfort marketing focuses on inspiring nostalgia in customers, having them recall previous positive experiences or feelings. Nostalgic marketing techniques may ease mild grief in some cases, but using them to alleviate severe distress probably will not be as effective, and this approach has contributed to several seemingly out-of-touch “COVID-19 era” commercials.

When addressing comfort, marketers should understand the type and hierarchy of comfort they are catering to. Not all comforts are equal: some comforts make us feel better than others, and some do not comfort us at all. A better understanding of how and why comforts differ among individuals, and the ability to predict comfort preferences for a given product or service, will help marketers market their goods and services more effectively. By diversifying and specializing comfort marketing using this hierarchical method, marketers will be able to reach their customers more meaningfully during “uncertain times.”

Contributors: Tarpley, Rachel Michelle (Author) / Eaton, John (Thesis director) / Mokwa, Michael (Committee member) / Department of Management and Entrepreneurship (Contributor) / Barrett, The Honors College (Contributor)
Created: 2021-05
Description
The purpose of this paper is to raise awareness of the problems nonrevenue sports face today by analyzing the key factors of the failing Division 1 model and highlighting some unforeseen consequences of eliminating nonrevenue sports. The first section explores the elimination and financial trends of NCAA Division 1 in a historical and contemporary context. The second section examines the deep-rooted problems associated with collegiate sports. The third section analyzes unforeseen consequences for athletic departments that should be accounted for when contemplating the elimination of a nonrevenue program.

Contributors: Belshay, Cade Michael (Author) / Eaton, John (Thesis director) / Mokwa, Michael (Committee member) / Department of Finance (Contributor) / Barrett, The Honors College (Contributor)
Created: 2021-05
Description
Following the success of incorporating perceptual models in audio coding algorithms, their application in other speech/audio processing systems is expanding. In general, all perceptual speech/audio processing algorithms involve minimization of an objective function that directly or indirectly incorporates properties of human perception. This dissertation primarily investigates the problems associated with directly embedding an auditory model in the objective function formulation and proposes solutions to the resulting high complexity, for use in real-time speech/audio algorithms. Specific problems addressed include: 1) the development of approximate but computationally efficient auditory model implementations that are consistent with the principles of psychoacoustics, and 2) the development of a mapping scheme that allows synthesizing a time/frequency domain representation from its equivalent auditory model output. The first problem concerns the high computational complexity of solving perceptual objective functions, which require repeated application of the auditory model to evaluate different candidate solutions. In this dissertation, frequency-pruning and detector-pruning algorithms are developed that efficiently implement the various auditory model stages. The performance of the pruned model is compared to that of the original auditory model for different types of test signals in the SQAM database. Experimental results indicate only a 4-7% relative error in loudness while attaining up to an 80-90% reduction in computational complexity. Similarly, a hybrid algorithm is developed specifically for sinusoidal signals; it employs the proposed auditory pattern combining technique together with a look-up table that stores representative auditory patterns.
The second problem concerns obtaining an estimate of the auditory representation that minimizes a perceptual objective function and transforming the auditory pattern back to its equivalent time/frequency representation. This avoids the repeated application of auditory model stages to test different candidate time/frequency vectors when minimizing perceptual objective functions. In this dissertation, a constrained mapping scheme is developed by linearizing certain auditory model stages, ensuring that a time/frequency mapping corresponding to the estimated auditory representation is obtained. This paradigm was successfully incorporated in a perceptual speech enhancement algorithm and a sinusoidal component selection task.
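As a toy illustration of the frequency-pruning idea described above, the sketch below skips channels whose excitation falls below a fraction of the peak before running a compressive loudness stage on the survivors; the channel energies, exponent, and threshold are all illustrative assumptions, not the dissertation's model.

```python
# Per-channel excitation energies for one frame (assumed values).
excitation = [0.001, 0.02, 0.5, 1.0, 0.8, 0.05, 0.002, 0.0005]

def total_loudness(energies):
    # Compressive per-channel loudness (Stevens-style exponent, assumed
    # value), standing in for the expensive later auditory model stages.
    return sum(e ** 0.23 for e in energies)

# Frequency pruning: drop channels below 1% of the peak excitation so
# the expensive stage runs on fewer channels.
threshold = 0.01 * max(excitation)
kept = [e for e in excitation if e >= threshold]

full = total_loudness(excitation)     # reference: all 8 channels
pruned = total_loudness(kept)         # pruned: 5 channels evaluated
```

Here 3 of 8 channels are skipped at the cost of some loudness error; the dissertation's pruning criteria are what keep that error small (4-7%) on real SQAM signals while cutting most of the computation.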
Contributors: Krishnamoorthi, Harish (Author) / Spanias, Andreas (Thesis advisor) / Papandreou-Suppappola, Antonia (Committee member) / Tepedelenlioğlu, Cihan (Committee member) / Tsakalis, Konstantinos (Committee member) / Arizona State University (Publisher)
Created: 2011