Matching Items (2,978)

Description

Time studies are an effective tool to analyze current production systems and propose improvements. The problem that motivated the project was that conducting time studies and observing the progression of components across the factory floor is a manual process. Four Industrial Engineering students worked with a manufacturing company to develop Computer Vision technology that would automate the data collection process for time studies. The team worked in an Agile environment to complete over 120 classification sets, create 8 strategy documents, and utilize Root Cause Analysis techniques to audit and validate the performance of the trained Computer Vision data models. In the future, there is an opportunity to continue developing this product and expand the team’s work scope by applying more engineering skills to the collected data to drive factory improvements.

Contributors: Chmelnik, Nathan (Co-author) / de Guzman, Lorenzo (Co-author) / Johnson, Katelyn (Co-author) / Martz, Emma (Co-author) / Ju, Feng (Thesis director) / Courter, Brandon (Committee member) / Industrial, Systems & Operations Engineering Prgm (Contributor) / Barrett, The Honors College (Contributor)
Created: 2021-05
Description
Following the success in incorporating perceptual models in audio coding algorithms, their application in other speech/audio processing systems is expanding. In general, all perceptual speech/audio processing algorithms involve minimization of an objective function that directly/indirectly incorporates properties of human perception. This dissertation primarily investigates the problems associated with directly embedding an auditory model in the objective function formulation and proposes possible solutions to overcome high complexity issues for use in real-time speech/audio algorithms. Specific problems addressed in this dissertation include: 1) the development of approximate but computationally efficient auditory model implementations that are consistent with the principles of psychoacoustics, and 2) the development of a mapping scheme that allows synthesizing a time/frequency domain representation from its equivalent auditory model output. The first problem is aimed at addressing the high computational complexity involved in solving perceptual objective functions that require repeated application of the auditory model to evaluate different candidate solutions. In this dissertation, frequency pruning and detector pruning algorithms are developed that efficiently implement the various auditory model stages. The performance of the pruned model is compared to that of the original auditory model for different types of test signals in the SQAM database. Experimental results indicate only a 4-7% relative error in loudness while attaining up to an 80-90% reduction in computational complexity. Similarly, a hybrid algorithm is developed specifically for use with sinusoidal signals; it employs the proposed auditory pattern combining technique together with a look-up table that stores representative auditory patterns. The second problem concerns obtaining an estimate of the auditory representation that minimizes a perceptual objective function and transforming the auditory pattern back to its equivalent time/frequency representation. This avoids the repeated application of auditory model stages to test different candidate time/frequency vectors when minimizing perceptual objective functions. In this dissertation, a constrained mapping scheme is developed by linearizing certain auditory model stages, which ensures obtaining a time/frequency mapping corresponding to the estimated auditory representation. This paradigm was successfully incorporated in a perceptual speech enhancement algorithm and a sinusoidal component selection task.
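The frequency-pruning idea described above can be sketched with a toy example. The snippet below only illustrates the general notion (skip spectral bins far below the frame peak before running costly per-bin auditory-model stages); it is not the dissertation's actual algorithm, and the 40 dB threshold and test spectrum are assumptions.

import numpy as np

def pruned_bins(power_spectrum, prune_db=40.0):
    """Return kept bin indices and the fraction of total energy they retain."""
    p = np.asarray(power_spectrum, dtype=float)
    # Assumed pruning rule: drop bins more than `prune_db` below the frame peak.
    keep = p >= p.max() * 10.0 ** (-prune_db / 10.0)
    return np.flatnonzero(keep), p[keep].sum() / p.sum()

# A sparse tonal test spectrum: most bins fall below the threshold, so the
# expensive auditory-model stages would run on far fewer frequency detectors.
rng = np.random.default_rng(0)
spectrum = rng.uniform(1e-8, 1e-6, 512)
spectrum[[40, 120, 300]] = [1.0, 0.5, 0.25]
idx, frac = pruned_bins(spectrum)
print(f"kept {idx.size}/512 bins holding {frac:.4f} of the energy")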
Contributors: Krishnamoorthi, Harish (Author) / Spanias, Andreas (Thesis advisor) / Papandreou-Suppappola, Antonia (Committee member) / Tepedelenlioğlu, Cihan (Committee member) / Tsakalis, Konstantinos (Committee member) / Arizona State University (Publisher)
Created: 2011
Description
Single-cell phenotypic heterogeneity studies reveal more information about the pathogenesis process than conventional bulk methods. Furthermore, investigation of individual cellular response mechanisms during rapid environmental changes can only be achieved at the single-cell level. By enabling the study of cellular morphology, a single-cell three-dimensional (3D) imaging system can be used to diagnose fatal diseases, such as cancer, at an early stage. One proven method, CellCT, accomplishes 3D imaging by rotating a single cell around a fixed axis. However, some existing cell-rotating mechanisms require intricate microfabrication, and some fail to provide a suitable environment for living cells. This thesis develops a microvortex chamber that allows living cells to be rotated by hydrodynamic flow alone while facilitating imaging access. In this thesis work, 1) the new chamber design was developed through numerical simulation. Simulations revealed that in order to form a microvortex in the side chamber, the ratio of the chamber opening to the channel width must be smaller than one. After comparing different chamber designs, the trapezoidal side chamber was selected because it demonstrated controllable circulation and met the imaging requirements. Microvortex properties were not sensitive to chamber interface angles ranging from 0.32 to 0.64. A similar trend was observed when chamber heights were larger than the chamber opening. 2) Micro-particle image velocimetry was used to characterize microvortices and validate the simulation results. Agreement between experiment and simulation confirmed that numerical simulation was an effective method for chamber design. 3) Finally, cell rotation experiments were performed in the trapezoidal side chamber. The experimental results demonstrated cell rotational rates ranging from 12 to 29 rpm for regular cells. With a volumetric flow rate of 0.5 µL/s, an irregular cell rotated at a mean rate of 97 ± 3 rpm. Rotational rates can be changed by altering inlet flow rates.
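As a rough illustration of the design rule and rotation figures quoted above, the short sketch below encodes the opening-to-channel-width criterion and converts rotation rates from rpm to rad/s. It is a toy calculation under assumed dimensions, not the thesis's simulation workflow.

from math import pi

def vortex_expected(opening_um: float, channel_width_um: float) -> bool:
    # Reported criterion: a side-chamber microvortex forms when opening/width < 1.
    return opening_um / channel_width_um < 1.0

def rpm_to_rad_per_s(rpm: float) -> float:
    return rpm * 2.0 * pi / 60.0

# Hypothetical geometry; the actual chamber dimensions are not given here.
print(vortex_expected(opening_um=80.0, channel_width_um=120.0))       # True
print(f"{rpm_to_rad_per_s(29):.2f} rad/s at the reported 29 rpm upper rate")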
Contributors: Zhang, Wenjie (Author) / Frakes, David (Thesis advisor) / Meldrum, Deirdre (Thesis advisor) / Chao, Shih-hui (Committee member) / Wang, Xiao (Committee member) / Arizona State University (Publisher)
Created: 2011
Description
Concern regarding the quality of traffic data exists among engineers and planners tasked with obtaining and using the data for various transportation applications. While data quality issues are often understood by the analysts doing the hands-on work, rarely are the quality characteristics of the data effectively communicated beyond the analyst. This research is an exercise in measuring and reporting data quality. The assessment was conducted to support the performance measurement program at the Maricopa Association of Governments in Phoenix, Arizona, and investigates the traffic data from 228 continuous monitoring freeway sensors in the metropolitan region. Results of the assessment provide an example of describing the quality of the traffic data with each of six data quality measures suggested in the literature: accuracy, completeness, validity, timeliness, coverage, and accessibility. An important contribution is made in the use of data quality visualization tools. These visualization tools are used to evaluate the validity of the traffic data beyond the pass/fail criteria commonly used. More significantly, they serve to build an intuitive understanding of the underlying characteristics of the data considered valid. Recommendations from the experience gained in this assessment include that data quality visualization tools be developed and used in the processing and quality control of traffic data, and that these visualization tools, along with other information on the quality control effort, be stored as metadata with the processed data.
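Two of the six measures named above, completeness and validity, lend themselves to a compact illustration. The sketch below is a generic example: the record fields, the 5-minute polling interval, and the range checks are assumptions, not the rules used in the Maricopa Association of Governments assessment.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Record:
    speed_mph: Optional[float]    # None marks a missing or failed poll
    volume_vph: Optional[float]

def completeness(records, expected_intervals):
    """Share of expected polling intervals that produced any record at all."""
    return len(records) / expected_intervals

def validity(records):
    """Share of received records passing simple (assumed) range checks."""
    def ok(r):
        return (r.speed_mph is not None and 0.0 <= r.speed_mph <= 100.0
                and r.volume_vph is not None and 0.0 <= r.volume_vph <= 3000.0)
    return sum(ok(r) for r in records) / len(records)

day = [Record(62.0, 1400.0), Record(None, 1500.0), Record(58.0, 1350.0)]
print(f"completeness = {completeness(day, expected_intervals=288):.3f}")  # 288 five-minute bins per day
print(f"validity     = {validity(day):.3f}")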
Contributors: Samuelson, Jothan P (Author) / Pendyala, Ram M. (Thesis advisor) / Ahn, Soyoung (Committee member) / Arizona State University (Publisher)
Created: 2011
Description
In the search for chemical biosensors designed for patient-based physiological applications, non-invasive diagnostic approaches continue to have value. The work described in this thesis builds upon previous breath analysis studies. In particular, it seeks to assess the adsorptive mechanisms active in both acetone and ethanol biosensors designed for breath analysis. The thermoelectric biosensors under investigation were constructed using a thermopile for transduction and four different materials for biorecognition. The analytes, acetone and ethanol, were evaluated under dry-air and humidified-air conditions. The biosensor response to acetone concentration was found to be both repeatable and linear, while the sensor response to ethanol presence was also found to be repeatable. The different biorecognition materials produced discernible thermoelectric responses that were characteristic for each analyte. The sensor output data is presented in this report. Additionally, the results were evaluated against a mathematical model for further analysis. Ultimately, a thermoelectric biosensor based upon adsorption chemistry was developed and characterized. Additional work is needed to characterize the physicochemical action mechanism.
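Because the acetone response is reported as linear and repeatable, a least-squares calibration line is one natural way to summarize such data. The sketch below uses made-up placeholder readings, not measurements from the thesis, purely to show the calculation.

import numpy as np

acetone_ppm   = np.array([0.0, 0.5, 1.0, 2.0, 4.0])   # hypothetical concentrations
thermopile_uV = np.array([0.1, 2.2, 4.1, 8.3, 16.0])  # hypothetical sensor outputs

slope, intercept = np.polyfit(acetone_ppm, thermopile_uV, deg=1)
residuals = thermopile_uV - (slope * acetone_ppm + intercept)
print(f"sensitivity = {slope:.2f} uV/ppm, offset = {intercept:.2f} uV")
print(f"max residual = {np.abs(residuals).max():.2f} uV")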
Contributors: Wilson, Kimberly (Author) / Guilbeau, Eric (Thesis advisor) / Pizziconi, Vincent (Thesis advisor) / LaBelle, Jeffrey (Committee member) / Arizona State University (Publisher)
Created: 2011
Description
The video game graphics pipeline has traditionally rendered the scene using a polygonal approach. Advances in modern graphics hardware now allow the rendering of parametric methods. This thesis explores various smooth surface rendering methods that can be integrated into the video game graphics engine. Moving over to parametric or smooth surfaces from the polygonal domain has its share of issues, and there is an inherent need to address various rendering bottlenecks that could hamper such a move. The game engine needs to choose an appropriate method based on the in-game characteristics of the objects; character and animated objects need more sophisticated methods, whereas static objects can use simpler techniques. Scaling the polygon count across various hardware platforms becomes an important factor. Much control is needed over the tessellation levels, whether imposed by hardware limitations or by the application, to be able to adaptively render the mesh without significant loss in performance. This thesis explores several methods that help game engine developers make correct design choices by optimally balancing the trade-offs while rendering the scene using smooth surfaces. It proposes a novel technique for adaptive tessellation of triangular meshes that greatly improves rendering speed while reducing tessellation counts. It develops an approximate method for rendering Loop subdivision surfaces on tessellation-enabled hardware. A taxonomy and evaluation of the methods is provided, and a unified rendering system that provides automatic level of detail by switching between the methods is proposed.
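A common way to control tessellation level adaptively, and one that conveys the trade-off discussed above, is to derive a per-edge factor from the edge's projected screen size. The sketch below is a generic heuristic under a simple pinhole-projection assumption, not the novel technique proposed in the thesis.

import math

def edge_tess_factor(p0, p1, view_dist, fov_deg, screen_h_px,
                     target_px=8.0, max_factor=64):
    """Pick a tessellation factor so each segment spans roughly target_px pixels."""
    edge_len = math.dist(p0, p1)
    # Approximate pixels per world unit for geometry at distance `view_dist`.
    px_per_unit = screen_h_px / (2.0 * view_dist * math.tan(math.radians(fov_deg) / 2.0))
    return max(1, min(max_factor, round(edge_len * px_per_unit / target_px)))

# A nearby edge is tessellated far more finely than the same edge seen from afar.
print(edge_tess_factor((0, 0, 0), (1, 0, 0), view_dist=2.0,  fov_deg=60.0, screen_h_px=1080))
print(edge_tess_factor((0, 0, 0), (1, 0, 0), view_dist=20.0, fov_deg=60.0, screen_h_px=1080))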
Contributors: Amresh, Ashish (Author) / Farin, Gerald (Thesis advisor) / Razdan, Anshuman (Thesis advisor) / Wonka, Peter (Committee member) / Hansford, Dianne (Committee member) / Arizona State University (Publisher)
Created: 2011
Description
There is increasing interest in the medical and behavioral health communities in developing effective strategies for the treatment of chronic diseases. Among these lie adaptive interventions, which consider adjusting treatment dosages over time based on participant response. Control engineering offers a broad-based solution framework for optimizing the effectiveness of such interventions. In this thesis, an approach is proposed to develop dynamical models and, subsequently, hybrid model predictive control schemes for assigning optimal dosages of naltrexone, an opioid antagonist, as treatment for a chronic pain condition known as fibromyalgia. System identification techniques are employed to model the dynamics from the daily diary reports completed by participants of a blind naltrexone intervention trial. These self-reports include assessments of outcomes of interest (e.g., general pain symptoms, sleep quality) and additional external variables (disturbances) that affect these outcomes (e.g., stress, anxiety, and mood). Using prediction-error methods, a multi-input model describing the effect of drug, placebo, and other disturbances on outcomes of interest is developed. This discrete-time model is approximated by a continuous second-order model with a zero, which was found to be adequate to capture the dynamics of this intervention. Data from 40 participants in two clinical trials were analyzed, and participants were classified as responders and non-responders based on the models obtained from system identification. The dynamical models can be used by a model predictive controller for automated dosage selection of naltrexone using feedback/feedforward control actions in the presence of external disturbances. The clinical requirement for categorical (i.e., discrete-valued) drug dosage levels creates a need for hybrid model predictive control (HMPC). The controller features a multiple degree-of-freedom formulation that enables the user to adjust the speed of setpoint tracking, measured disturbance rejection, and unmeasured disturbance rejection independently in the closed-loop system. The nominal and robust performance of the proposed control scheme is examined via simulation using system identification models from a representative participant in the naltrexone intervention trial. The controller evaluation described in this thesis gives credibility to the promise and applicability of control engineering principles for optimizing adaptive interventions.
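For readers unfamiliar with prediction-error identification, the flavor of the modeling step can be shown with a small ARX least-squares fit on synthetic data; for an ARX structure the prediction-error estimate reduces to ordinary least squares. The series below stands in for the daily diary data (dose input, pain outcome) and the coefficients are invented, so this is only a sketch of the technique, not the thesis's identified model.

import numpy as np

rng = np.random.default_rng(1)
N = 200
u = (rng.random(N) > 0.5).astype(float)   # hypothetical daily on/off dose signal
y = np.zeros(N)
for k in range(2, N):                     # synthetic 2nd-order response plus noise
    y[k] = (1.3 * y[k-1] - 0.4 * y[k-2]
            + 0.5 * u[k-1] - 0.2 * u[k-2]
            + 0.05 * rng.standard_normal())

# Regressor for y[k] = a1*y[k-1] + a2*y[k-2] + b1*u[k-1] + b2*u[k-2]
Phi = np.column_stack([y[1:-1], y[:-2], u[1:-1], u[:-2]])
theta, *_ = np.linalg.lstsq(Phi, y[2:], rcond=None)
print("estimated [a1, a2, b1, b2]:", np.round(theta, 3))   # close to [1.3, -0.4, 0.5, -0.2]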
Contributors: Deśapāṇḍe, Sunīla (Author) / Rivera, Daniel E. (Thesis advisor) / Si, Jennie (Committee member) / Tsakalis, Konstantinos (Committee member) / Arizona State University (Publisher)
Created: 2011
Description
As a term and method that is rapidly gaining popularity, Building Information Modeling (BIM) is under the scrutiny of many building professionals questioning its potential benefits on their projects. A relevant and accepted calculation methodology and baseline to properly evaluate BIM's benefits have not been established; thus there are mixed perspectives and opinions of the benefits of BIM, creating a general misunderstanding of the expected outcomes. The purpose of this thesis was to develop a more complete methodology for analyzing the benefits of BIM and to apply recent projects to this methodology to quantify outcomes, resulting in a more holistic framework of BIM and its impacts on project efficiency. From the literature, a framework calculation model to determine the value of BIM is developed and presented. The developed model is applied via case studies within a large industrial setting where similar projects are evaluated, some implementing BIM and some following traditional non-BIM approaches. Cost or investment metrics were considered along with benefit or return metrics. The return metrics were requests for information (RFIs), change orders, and duration improvements. The investment metrics were design and construction costs. The methodology was tested against three separate cases, and results on the returns and investments are presented. The findings indicate that in the tool installation department of semiconductor manufacturing, there is a high potential for BIM benefits to be realized. The evidence also suggests that actual returns and investments will vary with each project.
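The return and investment metrics listed above suggest a simple relative comparison between a BIM project and a comparable non-BIM baseline. The sketch below is a generic illustration of that kind of calculation; the structure and all numbers are placeholders, not the framework or case-study data from the thesis.

from dataclasses import dataclass

@dataclass
class Project:
    design_cost: float        # investment metrics
    construction_cost: float
    rfi_count: int            # return metrics
    change_order_cost: float
    duration_weeks: float

def relative_change(bim: Project, baseline: Project) -> dict:
    """Percent change of each metric for the BIM project versus the baseline."""
    def pct(a, b):
        return 100.0 * (a - b) / b
    return {
        "total_cost_%":   pct(bim.design_cost + bim.construction_cost,
                              baseline.design_cost + baseline.construction_cost),
        "rfi_%":          pct(bim.rfi_count, baseline.rfi_count),
        "change_order_%": pct(bim.change_order_cost, baseline.change_order_cost),
        "duration_%":     pct(bim.duration_weeks, baseline.duration_weeks),
    }

print(relative_change(Project(120_000, 2_000_000, 35, 40_000, 26),
                      Project(100_000, 2_050_000, 90, 150_000, 30)))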
Contributors: Barlish, Kristen Caroline (Author) / Sullivan, Kenneth T. (Thesis advisor) / Kashiwagi, Dean T. (Committee member) / Badger, William W. (Committee member) / Arizona State University (Publisher)
Created: 2011
Description
The study of artist transcriptions is an effective vehicle for assimilating the language and style of jazz. Pairing transcriptions with historical context provides further insight into the back story of the artist's life and method. Innovators are often the subject of published studies of this kind, but transcriptions of plunger-mute master Al Grey have been overlooked. This document fills that void, combining historical context with thirteen transcriptions of Grey's trombone features and improvisations. Selection of transcribed materials was based on an examination of historically significant solos in Al Grey's fifty-five-year career. The results are a series of open-horn and plunger solos that showcase Grey's sound, technical brilliance, and wide range of dynamics and articulation. This collection includes performances from a mix of widely available and obscure recordings, the majority coming from engagements with the Count Basie Orchestra. Methods learned from the study of Al Grey's book Plunger Techniques were vital in the realization of his work. The digital transcription software Amazing Slow Downer by Roni Music aided in deciphering some of Grey's more complicated passages and, with octave displacement, helped bring previously inaudible moments to the foreground.
Contributors: Hopkins, Charles E (Author) / Pilafian, Sam (Thesis advisor) / Stauffer, Sandra (Committee member) / Solís, Ted (Committee member) / Ericson, John (Committee member) / Kocour, Michael (Committee member) / Arizona State University (Publisher)
Created: 2011
Description
Semiconductor nanowires (NWs) are one-dimensional materials and exhibit size quantization effects when the diameter is sufficiently small. They can serve as optical waveguides along the length direction and provide optically active gain at the same time. Due to these unique properties, NWs are very promising and are now extensively studied for nanoscale optoelectronic applications. A systematic and comprehensive optical and microstructural study of several important infrared semiconductor NWs is presented in this thesis, including InAs, PbS, InGaAs, erbium chloride silicate, and erbium silicate. Micro-photoluminescence (PL) and transmission electron microscopy (TEM) were utilized in conjunction to characterize the optical properties and microstructure of these wires. The focus of this thesis is on the optical study of semiconductor NWs at mid-infrared wavelengths. First, differently structured InAs NWs grown using various methods were characterized and compared. Three main PL peaks, below, near, and above the InAs bandgap, respectively, were observed. An octadecylthiol self-assembled monolayer was employed to passivate the surface of the InAs NWs to eliminate or reduce the effects of surface states. The band-edge emission from wurtzite-structured NWs was completely recovered after passivation. The passivated NWs showed very good stability in air and under heat. In the second part, a mid-infrared optical study was conducted on PbS wires of subwavelength diameter, and lasing was demonstrated under optical pumping. The PbS wires were grown on a Si substrate using chemical vapor deposition and have a rock-salt cubic structure. Single-mode lasing at wavelengths of ~3000-4000 nm was obtained from a single as-grown PbS wire up to a temperature of 115 K. PL characterization was also utilized to demonstrate the high crystallinity of the vertical arrays of InP and InGaAs/InP composition-graded heterostructure NWs made by a top-down fabrication method. TEM-related measurements were performed to study the crystal structures and elemental compositions of the Er-compound core-shell NWs. The core-shell NWs consist of an orthorhombic-structured erbium chloride silicate shell and a cubic-structured silicon core. These NWs provide unique Si-compatible materials with emission at 1530 nm for optical communications and solid-state lasers.
Contributors: Sun, Minghua (Author) / Ning, Cun-Zheng (Thesis advisor) / Yu, Hongbin (Committee member) / Carpenter, Ray W. (Committee member) / Johnson, Shane (Committee member) / Arizona State University (Publisher)
Created: 2011