This collection includes most ASU Theses and Dissertations from 2011 to the present. ASU Theses and Dissertations are available in downloadable PDF format; however, a small percentage of items are under embargo. Information about each thesis or dissertation includes degree information, committee members, an abstract, and supporting data or media.

In addition to the electronic theses found in the ASU Digital Repository, ASU Theses and Dissertations can be found in the ASU Library Catalog.

Dissertations and Theses granted by Arizona State University are archived and made available through a joint effort of the ASU Graduate College and the ASU Libraries. For more information or questions about this collection, visit the Digital Repository ETD Library Guide or contact the ASU Graduate College at gradformat@asu.edu.

Displaying 1 - 10 of 135
Description
The rapid escalation of technology and the widespread emergence of modern technological equipment have resulted in the generation of enormous amounts of digital data (in the form of images, videos and text). This has expanded the possibility of solving real-world problems using computational learning frameworks. However, while gathering large amounts of data is cheap and easy, annotating them with class labels is an expensive process in terms of time, labor and human expertise. This has paved the way for research in the field of active learning. Such algorithms automatically select the salient and exemplar instances from large quantities of unlabeled data and are effective in reducing the human labeling effort required to induce classification models. To exploit the possible presence of multiple labeling agents, there have been attempts toward a batch mode form of active learning, in which a batch of data instances is selected simultaneously for manual annotation. This dissertation is aimed at the development of novel batch mode active learning algorithms to reduce manual effort in training classification models for real-world multimedia pattern recognition applications.
Four major contributions are proposed in this work: (i) a framework for dynamic batch mode active learning, where the batch size and the specific data instances to be queried are selected adaptively through a single formulation, based on the complexity of the data stream in question; (ii) a batch mode active learning strategy for fuzzy label classification problems, where there is an inherent imprecision and vagueness in the class label definitions; (iii) batch mode active learning algorithms based on convex relaxations of an NP-hard integer quadratic programming (IQP) problem, with guaranteed bounds on the solution quality; and (iv) an active matrix completion algorithm and its application to several variants of the active learning problem (transductive active learning, multi-label active learning, active feature acquisition and active learning for regression). These contributions are validated on face recognition and facial expression recognition problems (commonly encountered in real-world applications such as robotics, security and assistive technology for the blind and visually impaired) and on collaborative filtering applications such as movie recommendation.
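The IQP-based formulation itself is not reproduced here; as a rough illustration of the batch-selection idea, a common entropy-based baseline (toy values only, not the dissertation's algorithm) can be sketched in a few lines:

```python
import numpy as np

def select_batch(probs, batch_size):
    """Select a batch of unlabeled instances by predictive entropy.

    probs: (n_instances, n_classes) class-probability estimates from the
    current classifier. Returns the indices of the most uncertain
    instances, a common batch-mode active learning baseline."""
    eps = 1e-12
    entropy = -np.sum(probs * np.log(probs + eps), axis=1)
    # The highest-entropy instances are queried for manual annotation.
    return np.argsort(entropy)[::-1][:batch_size]

# Toy pool of 4 instances over 3 classes.
probs = np.array([
    [0.98, 0.01, 0.01],   # confident prediction -> low entropy
    [0.34, 0.33, 0.33],   # near-uniform -> highest entropy
    [0.70, 0.20, 0.10],
    [0.50, 0.50, 0.00],
])
batch = select_batch(probs, batch_size=2)
```

Note that unlike this fixed-size baseline, the dissertation's dynamic formulation also chooses the batch size adaptively.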
ContributorsChakraborty, Shayok (Author) / Panchanathan, Sethuraman (Thesis advisor) / Balasubramanian, Vineeth N. (Committee member) / Li, Baoxin (Committee member) / Mittelmann, Hans (Committee member) / Ye, Jieping (Committee member) / Arizona State University (Publisher)
Created2013
Description
The main objective of this study is to investigate the mechanical behaviour of cement-based composites subjected to dynamic tensile loading, including the effects of strain rate, temperature and the addition of short fibres. A fabric pullout model and a tension-stiffening model based on a finite difference formulation, previously developed at Arizona State University, were used to study the bonding mechanism between fibre and matrix and the tension stiffening produced by the addition of fibres and textiles. Uniaxial tension tests were conducted on strain-hardening cement-based composites (SHCC) and on textile reinforced concrete (TRC) with and without the addition of short fibres, at strain rates ranging from 25 s⁻¹ to 100 s⁻¹. Historical data from quasi-static tests on the same materials were used to demonstrate the effects of high strain rate, including increases in average tensile strength, strain capacity and work-to-fracture. Polyvinyl alcohol (PVA), glass and polypropylene fibres were employed as concrete reinforcement. A state-of-the-art Phantom v7 high-speed camera was set up to record video at a frame rate of 10,000 fps. A random speckle pattern was applied to the surface of the specimens for image analysis. An optical, non-contacting deformation measurement technique, digital image correlation (DIC), was used to track the displacement field by comparing the reference image against the deformed images. DIC successfully obtained the full-field strain distribution and strain-versus-time responses, illuminated the bonding mechanism from the perspective of the strain field, and corrected the stress-strain responses.
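The core matching step behind DIC can be sketched as a zero-normalized cross-correlation search between a reference subset and candidate subsets in the deformed image. The subset size, search range and synthetic speckle pattern below are illustrative choices, not the study's actual parameters:

```python
import numpy as np

def ncc_displacement(ref, deformed, y, x, half, search):
    """Estimate the integer-pixel displacement of a subset centered at
    (y, x) by maximizing zero-normalized cross-correlation (ZNCC)
    between the reference subset and windows in the deformed image."""
    tpl = ref[y-half:y+half+1, x-half:x+half+1].astype(float)
    tpl = tpl - tpl.mean()
    best, best_uv = -2.0, (0, 0)
    for dv in range(-search, search + 1):
        for du in range(-search, search + 1):
            win = deformed[y+dv-half:y+dv+half+1,
                           x+du-half:x+du+half+1].astype(float)
            win = win - win.mean()
            denom = np.sqrt((tpl**2).sum() * (win**2).sum())
            if denom == 0:
                continue
            score = (tpl * win).sum() / denom
            if score > best:
                best, best_uv = score, (dv, du)
    return best_uv

# Synthetic speckle image rigidly shifted by (2, 3) pixels.
rng = np.random.default_rng(0)
ref = rng.random((40, 40))
deformed = np.roll(np.roll(ref, 2, axis=0), 3, axis=1)
uv = ncc_displacement(ref, deformed, 20, 20, half=5, search=4)
```

Practical DIC codes refine this integer estimate to sub-pixel accuracy and evaluate it over a grid of subsets to recover the full-field displacement.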
ContributorsYao, Yiming (Author) / Barzin, Mobasher (Thesis advisor) / Rajan, Subramaniam D. (Committee member) / Neithalath, Narayanan (Committee member) / Arizona State University (Publisher)
Created2013
Description
In recent years, machine learning and data mining technologies have received growing attention in areas such as recommendation systems, natural language processing, speech and handwriting recognition, image processing and the biomedical domain. Many applications that deal with physiological and biomedical data require person-specific or person-adaptive systems. The greatest challenge in developing such systems is subject-based variability in physiological and biomedical data, which leads to differences in data distributions and makes modeling these data with traditional machine learning algorithms complex and challenging. As a result, despite the wide application of machine learning, efficient deployment of its principles to model real-world data remains a challenge. This dissertation addresses the problem of subject-based variability in physiological and biomedical data and proposes person-adaptive prediction models based on novel transfer and active learning algorithms, an emerging field in machine learning. One significant contribution of this dissertation is a person-adaptive method for early detection of muscle fatigue from surface electromyogram signals, based on a new multi-source transfer learning algorithm. The dissertation also proposes a subject-independent algorithm for grading the progression of muscle fatigue in a test subject on a 0-to-1 scale, during isometric or dynamic contractions, in real time. Besides subject-based variability, biomedical image data also vary with the imaging technique, leading to distribution differences between image databases; a classifier learned on one database may therefore perform poorly on another.
Another significant contribution of this dissertation is the design and development of an efficient biomedical image data annotation framework, based on a novel combination of transfer learning and a new batch-mode active learning method, capable of addressing the distribution differences across databases. The methodologies developed in this dissertation are relevant and applicable to a large set of computing problems where data vary greatly between subjects or sources, such as face detection, pose detection and speech recognition. From a broader perspective, these frameworks can be viewed as a first step toward the design of automated adaptive systems for real-world data.
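The multi-source transfer learning algorithm itself is not reproduced here; a minimal sketch of the general idea, weighting each source-domain classifier by its accuracy on a few labeled target instances, might look like the following (all predictions below are hypothetical):

```python
import numpy as np

def weighted_vote(source_preds, few_preds, few_labels):
    """Combine binary predictions from several source-domain classifiers
    by weighting each source with its accuracy on a handful of labeled
    target instances. A minimal illustration of the multi-source
    transfer idea, not the dissertation's algorithm."""
    weights = np.array([np.mean(p == few_labels) for p in few_preds],
                       dtype=float)
    weights = weights / weights.sum()
    # Weighted vote over sources, thresholded at 0.5.
    combined = sum(w * p for w, p in zip(weights, source_preds))
    return (combined >= 0.5).astype(int)

# Source A matches the target domain, source B does not, as revealed
# by four labeled target instances used only to estimate the weights.
few_labels = np.array([1, 0, 1, 0])
few_preds = [np.array([1, 0, 1, 0]),   # source A: accuracy 1.0
             np.array([0, 1, 0, 1])]   # source B: accuracy 0.0
source_preds = [np.array([1, 1, 0]),   # source A on the full target set
                np.array([0, 0, 1])]   # source B on the full target set
fused = weighted_vote(source_preds, few_preds, few_labels)
```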
ContributorsChattopadhyay, Rita (Author) / Panchanathan, Sethuraman (Thesis advisor) / Ye, Jieping (Thesis advisor) / Li, Baoxin (Committee member) / Santello, Marco (Committee member) / Arizona State University (Publisher)
Created2013
Description
Currently, to interact with computer-based systems one needs to learn the specific interface language of each system. In most cases, interaction would be much easier if it could be done in natural language; this requires a module that understands natural language and automatically translates it into the interface language of the system. The NL2KR (Natural Language to Knowledge Representation) v.1 system is a prototype of such a system. It is a learning-based system that learns new meanings of words in terms of lambda-calculus formulas, given an initial lexicon of some words and their meanings and a training corpus of sentences with their translations. As part of this thesis, we take the prototype NL2KR v.1 system and enhance various components to make it usable for substantial and useful interface languages. We revamped the lexicon-learning components, the Inverse-lambda and Generalization modules, and redesigned the lexicon-learning algorithm that uses these components to learn new meanings of words. Similarly, we re-developed the system's built-in parser in Answer Set Programming (ASP) and integrated an external parser with the system. We also added new features, such as various system configurations and a memory cache in the learning component of the NL2KR system. These enhancements allowed more word meanings to be learned, boosted the performance of the system by reducing computation time by a factor of 8, and improved its usability. We evaluated the NL2KR system on the iRODS domain. iRODS is a rule-oriented data system that helps in managing large sets of computer files using policies. It provides a rule-oriented interface language whose syntactic structure resembles that of a procedural programming language (e.g., C). However, direct translation of natural language (NL) to this interface language is difficult.
So, for automatic translation of NL to this language, we define a simple intermediate Policy Declarative Language (IPDL) to represent the knowledge in the policies, which can then be directly translated to iRODS rules. We developed a corpus of 100 policy statements and manually translated them into IPDL. This corpus is then used to evaluate the NL2KR system, on which we performed 10-fold cross-validation. Furthermore, using this corpus, we illustrate how the different components of our NL2KR system work.
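The lambda-calculus representation of word meanings can be illustrated with ordinary Python lambdas; the lexicon entries and the policy-flavored predicates below are invented for illustration and are not NL2KR's actual learned meanings:

```python
# Word meanings as lambda-calculus terms, modeled with Python lambdas.
# NL2KR learns such entries (and their lambda forms) from a training
# corpus; these three are hypothetical stand-ins.
lexicon = {
    # "every": lambda P. lambda Q. forall x (P(x) -> Q(x)), as a string
    "every": lambda P: lambda Q: f"forall x ({P('x')} -> {Q('x')})",
    "file": lambda x: f"file({x})",
    "is_archived": lambda x: f"archived({x})",
}

# Function application mirrors how adjacent constituents combine:
# ("every" applied to "file") applied to the verb phrase meaning.
meaning = lexicon["every"](lexicon["file"])(lexicon["is_archived"])
# meaning == "forall x (file(x) -> archived(x))"
```

Inverse-lambda goes in the opposite direction: given the meaning of a phrase and of one of its parts, it solves for the lambda term of the remaining word.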
ContributorsKumbhare, Kanchan Ravishankar (Author) / Baral, Chitta (Thesis advisor) / Ye, Jieping (Committee member) / Li, Baoxin (Committee member) / Arizona State University (Publisher)
Created2013
Description
This study focuses on incorporating the probabilistic nature of material properties (Kevlar® 49) into an existing deterministic finite element analysis (FEA) of a fabric-based engine containment system through Monte Carlo simulations (MCS), and on implementing probabilistic analysis in engineering design through Reliability Based Design Optimization (RBDO). First, the emphasis is on experimental data analysis, focusing on probabilistic distribution models that characterize the randomness associated with the experimental data. The material properties of Kevlar® 49 are modeled using this analysis and implemented, along with an existing spiral modeling scheme (SMS) and a user-defined constitutive model (UMAT), in fabric-based engine containment simulations in LS-DYNA. MCS of the model were performed to observe the failure patterns and exit velocities, and the solutions were compared with NASA experimental tests and with deterministic results. MCS with probabilistic material data give a better perspective on the results than a single deterministic simulation. The next part of the research implements the probabilistic material properties in engineering design. The main aim of structural design is to obtain optimal solutions; however, even when a deterministic optimization yields a cost-effective structure, the design becomes highly unreliable if the uncertainty associated with the system (material properties, loading, etc.) is not represented in the solution process. A reliable and optimal solution can be obtained by performing reliability analysis along with deterministic optimization, which is RBDO. In the RBDO problem formulation, reliability constraints are considered in addition to structural performance constraints.
This part of the research starts with an introduction to reliability analysis, covering first-order and second-order reliability methods, followed by simulation techniques used to obtain the probability of failure and the reliability of structures. Next, a decoupled RBDO procedure is proposed with a new reliability analysis formulation that uses sensitivity analysis to remove highly reliable constraints from the RBDO, thereby reducing the computational time and the number of function evaluations. Finally, the implementation of these reliability analysis concepts and RBDO in 2D finite element truss problems and a planar beam problem is presented and discussed.
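The simulation-based estimate of the probability of failure can be sketched in a few lines; the limit state and distributions below are hypothetical stand-ins, not the Kevlar® 49 containment model:

```python
import numpy as np

# Minimal Monte Carlo reliability sketch: limit state g(R, S) = R - S,
# with failure when g < 0, for hypothetical normal distributions of
# resistance R and load effect S (illustrative values only).
rng = np.random.default_rng(42)
n = 200_000
R = rng.normal(loc=50.0, scale=5.0, size=n)   # resistance
S = rng.normal(loc=30.0, scale=4.0, size=n)   # load effect
pf = np.mean(R - S < 0.0)                     # probability of failure

# For independent normals the exact reliability index is
# beta = (muR - muS) / sqrt(sigR**2 + sigS**2) = 20 / sqrt(41) ~ 3.12,
# so pf should come out near 9e-4.
```

First- and second-order reliability methods (FORM/SORM) instead approximate the limit-state surface near the most probable failure point, trading sampling cost for a local approximation.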
ContributorsDeivanayagam, Arumugam (Author) / Rajan, Subramaniam D. (Thesis advisor) / Mobasher, Barzin (Committee member) / Neithalath, Narayanan (Committee member) / Arizona State University (Publisher)
Created2012
Description
The main objective of this study is to develop an innovative system in the form of a sandwich-panel composite with textile reinforced skins and an aerated concrete core. Existing theoretical concepts along with extensive experimental investigations were utilized to characterize the behavior of cement-based systems in the presence of individual fibers and textile yarns. Part of this thesis is based on a material model developed at Arizona State University to simulate the experimental flexural response and back-calculate the tensile response. The model is based on a constitutive law consisting of a tri-linear tension model with residual strength and a bilinear, elastic-perfectly-plastic compression stress-strain model. This parametric model was used to characterize textile reinforced concrete (TRC) with aramid, carbon, alkali-resistant glass and polypropylene reinforcement, as well as hybrid systems of aramid and polypropylene. The same material model was used to characterize long-term durability issues in glass fiber reinforced concrete (GFRC), drawing on historical data on the effect of temperature on the aging of GFRC composites. An experimental study was conducted to understand the behavior of aerated concrete systems under high strain rate impact loading. The test setup was based on the free-fall drop of an instrumented hammer in a three-point bending configuration. Two types of aerated concrete, autoclaved aerated concrete (AAC) and polymeric fiber-reinforced aerated concrete (FRAC), were tested and compared in terms of their impact behavior. The effect of impact energy on the mechanical properties was investigated for various drop heights and specimen sizes. Both materials showed similar flexural load-carrying capacity under impact; however, the flexural toughness of fiber-reinforced aerated concrete proved to be substantially higher than that of plain autoclaved aerated concrete.
The effects of specimen size and drop height on the impact response of AAC and FRAC were studied and discussed. The results were compared to the performance of sandwich beams with AR glass textile skins and an aerated concrete core under similar impact conditions. From this extensive study it was concluded that this type of sandwich composite could be used effectively in low-cost, sustainable infrastructure projects.
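For a free-fall drop test, the nominal impact velocity and energy follow directly from the drop height via v = sqrt(2gh) and E = mgh; the hammer mass and drop heights below are hypothetical, not the study's actual values:

```python
import math

def impact_kinematics(mass_kg, drop_height_m, g=9.81):
    """Nominal impact velocity and energy for a free-fall drop test,
    neglecting friction losses in the guide (illustrative only)."""
    velocity = math.sqrt(2.0 * g * drop_height_m)   # v = sqrt(2 g h)
    energy = mass_kg * g * drop_height_m            # E = m g h
    return velocity, energy

# Hypothetical 12 kg hammer dropped from three heights.
for h in (0.5, 1.0, 2.0):
    v, e = impact_kinematics(mass_kg=12.0, drop_height_m=h)
```

Varying the drop height in this way is how the impact energy is swept in such tests; the measured response then characterizes the material, since losses make the delivered energy somewhat lower than E.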
ContributorsDey, Vikram (Author) / Mobasher, Barzin (Thesis advisor) / Rajan, Subramaniam D. (Committee member) / Neithalath, Narayanan (Committee member) / Arizona State University (Publisher)
Created2012
Description
Nuclear magnetic resonance (NMR) is an important phenomenon involving nuclear magnetic moments in a magnetic field, which can provide much information about a wide range of materials, including their chemical composition, chemical environments and nuclear spin interactions. The NMR spectrometer has been extensively developed and used in many areas of research. In this thesis, studies in two different areas using NMR are presented. First, a new kind of nanoparticle, Gd(DTPA)-intercalated layered double hydroxide (LDH), has been synthesized in the laboratory of Prof. Dey in SEMTE at ASU. In Chapter II, NMR relaxation studies of two types of LDH (Mg,Al-LDH and Zn,Al-LDH) are presented; the results show that, when intercalated with Gd(DTPA), they have a higher relaxivity than current commercial magnetic resonance imaging (MRI) contrast agents, such as DTPA in water solution, so this material may be useful as an MRI contrast agent. Several conditions, such as nanoparticle size, pH and intercalation percentage, were examined to determine the optimal relaxivity of the nanoparticle. Further NMR studies and simulations were conducted to explain the high relaxivity. Second, fly ash is a cementitious material that has attracted great interest because, when activated by an alkaline solution, it can replace ordinary Portland cement as a concrete binder. However, the reaction of activated fly ash is not fully understood. In Chapter III, pore structure and NMR studies of fly ash activated with different activators, including NaOH and KOH (4 M and 8 M) and Na/K silicate, are presented. The pore structure, degree of order and proportions of the different components in the reaction product were obtained, which reveal much about the reaction and the makeup of the final product.
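Relaxivity is conventionally obtained as the slope of the relaxation rate 1/T1 against agent concentration (1/T1 = 1/T1_0 + r1·[Gd]); a minimal fit can be sketched as follows, with hypothetical data points standing in for the Gd(DTPA)-LDH measurements:

```python
import numpy as np

# Hypothetical T1 measurements at increasing agent concentration,
# consistent with r1 ~ 3 s^-1 mM^-1 and T1_0 = 3 s (illustration only).
conc_mM = np.array([0.0, 0.1, 0.2, 0.4, 0.8])     # [Gd] in mM
T1_s = np.array([3.0, 1.58, 1.07, 0.65, 0.37])    # measured T1 in s

rates = 1.0 / T1_s                                # relaxation rate 1/T1
r1, intercept = np.polyfit(conc_mM, rates, 1)     # slope = relaxivity r1
# r1 is in s^-1 mM^-1; a higher r1 means a more effective contrast agent,
# and the intercept recovers the agent-free rate 1/T1_0.
```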
ContributorsPeng, Zihui (Author) / Marzke, Robert F (Thesis advisor) / Dey, Sandwip Kumar (Committee member) / Neithalath, Narayanan (Committee member) / Chamberlin, Ralph Vary (Committee member) / Mccartney, Martha Rogers (Committee member) / Arizona State University (Publisher)
Created2013
Description
Tall buildings are spreading across the globe at an ever-increasing rate (www.ctbuh.org). The global number of buildings 200 m or more in height has risen from 286 to 602 in the last decade alone. The increasing complexity of building architecture poses unique challenges in the structural design of modern tall buildings, so innovative structural systems need to be evaluated to create an economical design that satisfies multiple design criteria. Design using a traditional trial-and-error approach can be extremely time-consuming and the resulting design uneconomical. Thus, there is a need for an efficient numerical optimization tool that can explore and generate several design alternatives in the preliminary design phase, leading to a more desirable final design. In this study, we present the details of a tool that can be very useful in preliminary design optimization: finite element modeling; design optimization; translation of design code requirements into components of the FE and design optimization models; and pre- and post-processing to verify the veracity of the model. Emphasis is placed on the development and deployment of various FE models (static, modal and dynamic analyses; linear, beam and plate/shell finite elements), design optimization problem formulations (sizing, shape, topology and material selection optimization) and numerical optimization tools (gradient-based and evolutionary optimization methods) [Rajan, 2001]. Design optimization results for full-scale three-dimensional buildings subject to multiple design criteria, including stress, serviceability and dynamic response, are discussed.
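The sizing-optimization idea can be illustrated with the classic fully-stressed-design iteration, a simple heuristic and not the gradient-based or evolutionary optimizers used in the study (the truss values below are hypothetical):

```python
def fully_stressed_sizing(areas, forces, sigma_allow, n_iter=20):
    """Classic fully-stressed-design iteration for sizing optimization
    of truss members: resize each cross-sectional area so that the
    member stress approaches the allowable stress.

    For a statically determinate truss the member forces do not change
    with area, so this converges in one step; the loop stands in for
    the general resize-reanalyze cycle of sizing optimization."""
    for _ in range(n_iter):
        areas = [abs(F) / sigma_allow for F in forces]
    return areas

# Hypothetical two-bar truss: axial forces in N, allowable stress in Pa.
areas = fully_stressed_sizing([1e-3, 1e-3],
                              forces=[5000.0, -2000.0],
                              sigma_allow=100e6)
# Each member ends up sized so that |F| / A equals sigma_allow.
```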
ContributorsSirigiri, Mamatha (Author) / Rajan, Subramaniam D. (Thesis advisor) / Neithalath, Narayanan (Committee member) / Mobasher, Barzin (Committee member) / Arizona State University (Publisher)
Created2014
Description
This study investigated the ability of a polymeric-enhanced high-tenacity fabric composite called CarbonFlex to mitigate damage from multiple natural hazards, namely earthquakes and tornadoes, in wood-framed structures. Typically, the wood-framed shear wall is the seismic protection system used in low-rise wood structures. It is well known that the main energy dissipation mechanism of the system is its fasteners (nails), which do not dissipate enough energy to preserve the structure's integrity. Moreover, wood shear walls cannot sustain their stiffness after experiencing moderate wall drift, which makes them susceptible to strong aftershocks. Therefore, a CarbonFlex shear wall system was proposed for use in wood-framed structures. Seven full-size CarbonFlex shear walls and a CarbonFlex-wrapped structure were tested, and the results were compared to those of conventional wood-framed shear walls and a wood structure. The comparisons indicated that the CarbonFlex specimens sustained their strength and fully recovered their initial stiffness even after experiencing four percent story drift, while the stiffness of the conventional structure degraded dramatically; CarbonFlex shear wall systems thus provide better seismic protection for wood-framed structures. To evaluate the capability of CarbonFlex to resist impact damage from wind-borne debris in tornadoes, several debris impact tests of CarbonFlex wall panels and a carbon fiber reinforced storm shelter wall panel were conducted. Three CarbonFlex wall panels passed the test at the highest debris impact speed and the other two passed at the second-highest speed, while the carbon fiber panel failed at both impact speeds.
ContributorsDhiradhamvit, Kittinan (Author) / Attard, Thomas L (Thesis advisor) / Fafitis, Apostolos (Thesis advisor) / Neithalath, Narayanan (Committee member) / Thomas, Benjamin (Committee member) / Arizona State University (Publisher)
Created2013
Description
Texture analysis plays an important role in applications such as automated pattern inspection, image and video compression, content-based image retrieval, remote sensing, medical imaging and document processing, to name a few. Texture structure analysis is the process of studying the structure present in textures; this structure can be expressed in terms of perceived regularity. The human visual system (HVS) uses perceived regularity as one of the important pre-attentive cues in low-level image understanding. Like the HVS, image processing and computer vision systems can make fast and efficient decisions if they can quantify this regularity automatically. In this work, the problem of quantifying the degree of perceived regularity of an arbitrary texture is introduced and addressed. One key contribution of this work is an objective no-reference perceptual texture regularity metric based on visual saliency. Other key contributions include an adaptive texture synthesis method based on texture regularity, and a low-complexity reduced-reference visual quality metric for assessing the quality of synthesized textures. In order to use the best-performing visual attention model on textures, the most popular visual attention models were evaluated for their ability to predict visual saliency on textures. Since no publicly available database provides ground-truth saliency maps for images with exclusively texture content, a new eye-tracking database was systematically built. Using the visual saliency map (VSM) generated by the best visual attention model, the proposed texture regularity metric is computed. The metric is based on the observation that VSM characteristics differ between textures of differing regularity, and combines two texture regularity scores: a textural similarity score and a spatial distribution score.
To evaluate the performance of the proposed regularity metric, a texture regularity database called RegTEX was built as part of this work. Subjective testing showed that the proposed metric has a strong correlation with the Mean Opinion Score (MOS) for the perceived regularity of textures. The proposed method is also shown to be robust to geometric and photometric transformations and to outperform several popular texture regularity metrics in predicting perceived regularity. The impact of the proposed metric on the performance of several image-processing applications is also presented. The influence of perceived texture regularity on the perceptual quality of synthesized textures is demonstrated through a synthesized-texture database named SynTEX. Subjective testing showed that textures with different degrees of perceived regularity exhibit different degrees of vulnerability to artifacts resulting from different texture synthesis approaches. This work also proposes an algorithm for adaptively selecting the appropriate texture synthesis method based on the perceived regularity of the original texture. A reduced-reference quality metric for texture synthesis is also proposed, based on the change in perceived regularity and the change in perceived granularity between the original and synthesized textures; the perceived granularity is quantified through a new granularity metric proposed in this work. Subjective testing showed that the proposed quality metric, using just two parameters, has a strong correlation with the MOS for the fidelity of synthesized textures and outperforms state-of-the-art full-reference quality metrics on three different texture databases. Finally, the ability of the proposed regularity metric to predict the perceived degradation of textures due to compression and blur artifacts is also established.
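The saliency-based metric itself is not reproduced here; a much simpler autocorrelation-based cue illustrates how texture regularity can be quantified at all (an illustrative stand-in, not the proposed metric):

```python
import numpy as np

def regularity_score(texture):
    """Illustrative regularity cue: the height of the strongest
    off-center peak in the normalized circular autocorrelation of a
    texture. Periodic textures produce strong secondary peaks at the
    repeat period; irregular textures do not."""
    t = texture - texture.mean()
    # Autocorrelation via the FFT (Wiener-Khinchin theorem),
    # normalized so the zero-lag value is 1.
    spectrum = np.abs(np.fft.fft2(t)) ** 2
    ac = np.real(np.fft.ifft2(spectrum))
    ac = ac / ac[0, 0]
    ac[0, 0] = 0.0                 # suppress the trivial zero-lag peak
    return float(ac.max())         # strength of the best repeat

# A perfectly periodic texture scores near 1; noise scores near 0.
periodic = np.tile(np.sin(2 * np.pi * np.arange(64) / 8.0), (64, 1))
noise = np.random.default_rng(1).random((64, 64))
```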
ContributorsVaradarajan, Srenivas (Author) / Karam, Lina J (Thesis advisor) / Chakrabarti, Chaitali (Committee member) / Li, Baoxin (Committee member) / Tepedelenlioğlu, Cihan (Committee member) / Arizona State University (Publisher)
Created2014