This collection includes most of the ASU Theses and Dissertations from 2011 to the present. ASU Theses and Dissertations are available in downloadable PDF format; however, a small percentage of items are under embargo. Information about the dissertations and theses includes degree information, committee members, an abstract, and supporting data or media.

In addition to the electronic theses found in the ASU Digital Repository, ASU Theses and Dissertations can be found in the ASU Library Catalog.

Dissertations and Theses granted by Arizona State University are archived and made available through a joint effort of the ASU Graduate College and the ASU Libraries. For more information or questions about this collection, visit the Digital Repository ETD Library Guide or contact the ASU Graduate College at gradformat@asu.edu.


Description

A smart home system (SHS) is an information system aimed at realizing home automation. An SHS can connect to almost any kind of electronic or electric device used in a home so that these devices can be controlled and monitored centrally. Today's technology also allows homeowners to control and monitor the SHS installed in their homes remotely, typically by giving the SHS network access. Although network access brings many conveniences to homeowners, it also exposes the SHS to more security threats than ever before. As a result, the security threats an SHS might face should be given careful consideration when it is designed. Security threats can be addressed properly by first understanding them and identifying the parts of the system that must be protected against them. This leads to the idea of addressing the security threats an SHS might face at the requirements engineering level. Following this idea, this paper proposes a systematic approach to generating security requirements specifications for the SHS. It can be viewed as the first step toward a complete SHS security requirements engineering process.
Contributors: Xu, Rongcao (Author) / Ghazarian, Arbi (Thesis advisor) / Bansal, Ajay (Committee member) / Lindquist, Timothy (Committee member) / Arizona State University (Publisher)
Created: 2013
Description

The Internet is transforming its look; in a short span of time we have come a long way from black-and-white web forms with plain buttons to responsive, colorful, and appealing user interface elements. With the sudden rise in demand for web applications, developers are making full use of the power of HTML5, JavaScript, and CSS3 to cater to their users on various platforms. There was never a need to classify the ways in which these languages can be interconnected, because front-end code bases were relatively small and did not involve critical business logic. This thesis focuses on listing and defining all dependencies between HTML5, JavaScript, and CSS3 that will help developers better understand the interconnections within these languages. We also explore the techniques currently available to developers to keep their code free of dependency-related defects. We build a prototype tool, HJCDepend, based on our model, which aims at helping developers discover and remove such defects early in the development cycle.
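
The dependency types themselves are catalogued in the thesis rather than in this abstract; the toy sketch below (not HJCDepend, and with purely illustrative fragments and regular expressions) shows the simplest kind of cross-language coupling involved: an element id declared in HTML and referenced from both CSS and JavaScript, with references that have no matching declaration flagged as dependency-related defects.

import re

# Illustrative fragments; a real page would be read from its source files.
html = '<button id="submit-btn" class="primary">Send</button>'
css  = '#submit-btn { color: white; }'
js   = 'document.getElementById("submit-btn").addEventListener("click", send);'

# Ids declared in the markup.
declared_ids = set(re.findall(r'id="([^"]+)"', html))

# Ids referenced from CSS selectors and from JavaScript DOM lookups.
css_refs = set(re.findall(r'#([\w-]+)', css))
js_refs  = set(re.findall(r'getElementById\("([^"]+)"\)', js))

# A reference with no matching declaration is a dependency-related defect.
for ref in sorted(css_refs | js_refs):
    status = "resolved" if ref in declared_ids else "dangling"
    print(f'id "{ref}": {status}')
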
Contributors: Vasugupta (Author) / Gary, Kevin (Thesis advisor) / Lindquist, Timothy (Committee member) / Bansal, Ajay (Committee member) / Arizona State University (Publisher)
Created: 2014
Description

In accordance with Principal-Agent Theory, Property Rights Theory, Incentive Theory, and Human Capital Theory, firms face agency problems due to the “separation of ownership and management”, which call for effective corporate governance. Ownership structure is a core element of corporate governance. Differences in ownership structures thus may result in differential incentives in governance through the selection of senior management and the design of senior management compensation systems. This thesis investigates four firms with four different types of ownership structures: a publicly listed firm with a controlling interest held by the state, a publicly listed firm with a non-state-owned controlling interest, a publicly listed firm with a family-owned controlling interest, and a Sino-foreign joint venture firm. Using a case study approach, I focus on two dimensions of ownership structure characteristics, ownership diversification and differences in property rights, to document whether there are systematic differences in governance participation and executive compensation design. Specifically, I focus on whether such differences are reflected in management selection (which is linked to adverse selection and moral hazard problems) and in compensation design (the choices of performance measurements, performance pay, and stock options or restricted stock). The results are consistent with my expectation: the nature of the ownership structure does affect senior management compensation design. Policy implications are discussed accordingly.
Contributors: Gao, Shenghua (Author) / Pei, Ker-Wei (Thesis advisor) / Li, Feng (Committee member) / Shen, Wei (Committee member) / Arizona State University (Publisher)
Created: 2015
Description

For decades, microelectronics manufacturing has been concerned with failures related to electromigration phenomena in conductors experiencing high current densities. The influence of interconnect microstructure on electromigration-related device failures in BGA and flip chip solder interconnects has become of significant interest as individual solder interconnect volumes have shrunk. A survey indicates that x-ray computed micro-tomography (µXCT) is an emerging, novel means for characterizing the role microstructure plays in governing electromigration failures. This work details the design and construction of a lab-scale µXCT system to characterize electromigration in the Sn-0.7Cu lead-free solder system by leveraging in situ imaging.

In order to enhance the attenuation contrast observed in multi-phase material systems, a modeling approach has been developed to predict settings for the controllable imaging parameters that yield relatively high detection rates over the range of x-ray energies for which maximum attenuation contrast is expected in the polychromatic x-ray imaging system. To develop this predictive tool, a model of the Bremsstrahlung spectrum of an x-ray tube has been constructed, the detector's efficiency over the relevant range of x-ray energies has been calculated, and the product of the emitted and detected spectra has been used to compute the effective x-ray imaging spectrum. An approach has also been established for filtering 'zinger' noise in x-ray radiographs, which has proven problematic at the high x-ray energies used for solder imaging. The performance of this filter has been compared with a known existing method, and the results indicate a significant increase in the accuracy of zinger-filtered radiographs.
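
The zinger filter itself is not specified in the abstract; purely as an assumed illustration (not the thesis's filter or the existing method it was compared against), the sketch below shows one common way to suppress zingers, replacing pixels that spike above the per-pixel median of repeated exposures of the same view.

import numpy as np

def filter_zingers(frames, threshold=0.1):
    """Replace zinger pixels in repeated radiographs of a single view.

    frames has shape (n_repeats, rows, cols). Zingers appear as isolated,
    anomalously bright pixels in a single exposure, so any pixel exceeding
    the per-pixel median across repeats by more than `threshold` (relative)
    is replaced with that median.
    """
    frames = np.asarray(frames, dtype=float)
    median = np.median(frames, axis=0)
    zinger_mask = (frames - median) > threshold * np.maximum(median, 1e-12)
    return np.where(zinger_mask, median, frames), zinger_mask

# Example: three repeated 4x4 radiographs with one simulated zinger.
rng = np.random.default_rng(0)
base = rng.uniform(0.4, 0.6, size=(4, 4))
stack = np.stack([base, base, base])
stack[0, 2, 2] = 5.0  # simulated zinger in the first exposure
cleaned, mask = filter_zingers(stack)
print(mask.sum(), "zinger pixel(s) replaced")
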

The results obtained demonstrate a powerful means for studying failure-causing processes in solder systems used as interconnects in microelectronic packaging devices. These results include the volumetric quantification of parameters that are indicative of both the electromigration tolerance of solders and the dominant mechanisms of atomic migration in response to current stressing. This work aims to further the community's understanding of failure-causing electromigration processes in industrially relevant material systems for microelectronic interconnect applications and to advance the capability of available characterization techniques for their interrogation.
Contributors: Mertens, James Charles Edwin (Author) / Chawla, Nikhilesh (Thesis advisor) / Alford, Terry (Committee member) / Jiao, Yang (Committee member) / Neithalath, Narayanan (Committee member) / Arizona State University (Publisher)
Created: 2015
Description

In this dissertation, the results of our comprehensive computational studies of disordered jammed (i.e., mechanically stable) packings of hard particles are presented, including the family of superdisks in 2D and ellipsoids in 3D Euclidean space. Following a very brief introduction to hard-particle systems, the event-driven molecular dynamics (EDMD) method employed to generate the packing ensembles is discussed. A large number of 2D packing configurations of superdisks are then analyzed, through which a relatively accurate theoretical scheme for packing-fraction prediction based on local particle contact configurations is proposed and validated via additional numerical simulations. Moreover, studies of binary ellipsoid packing in 3D are briefly discussed, and the effects of different geometrical parameters on the final packing fraction are analyzed.
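
The prediction scheme itself is not reproduced in the abstract, but the quantity it targets is simple to state. As an assumed illustration (not the thesis code), the sketch below evaluates the packing fraction of N congruent superdisks, defined by |x/a|^(2p) + |y/a|^(2p) <= 1, in a square box of side L using the closed-form superdisk area.

from math import gamma

def superdisk_area(a, p):
    """Area of the superdisk |x/a|^(2p) + |y/a|^(2p) <= 1.

    p = 1 gives a circle of radius a; p -> infinity approaches a square of
    side 2a. The closed form follows from a standard beta-function integral.
    """
    return 4.0 * a * a * gamma(1.0 + 1.0 / (2.0 * p)) ** 2 / gamma(1.0 + 1.0 / p)

def packing_fraction(n_particles, a, p, box_side):
    """Fraction of a square box covered by n congruent superdisks."""
    return n_particles * superdisk_area(a, p) / box_side ** 2

print(superdisk_area(1.0, 1.0))              # circle check: ~3.14159
print(packing_fraction(100, 0.5, 1.5, 10.0))
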
Contributors: Xu, Yaopengxiao (Author) / Jiao, Yang (Thesis advisor) / Oswald, Jay (Committee member) / Liu, Yongming (Committee member) / Arizona State University (Publisher)
Created: 2014
Description

The processing of large volumes of RDF data requires an efficient storage and query processing engine that can scale well with the volume of data. The initial attempts to address this issue focused on optimizing native RDF stores as well as conventional relational database management systems. But as the volume of RDF data grew exponentially, the limitations of these systems became apparent and researchers began to focus on using big data analysis tools, most notably Hadoop, to process RDF data. Various studies and benchmarks that evaluate these tools for RDF data processing have been published. In the past two and a half years, however, heavy users of big data systems, such as Facebook, noted limitations in the query performance of these systems and began to develop new distributed query engines for big data that do not rely on map-reduce. Facebook's Presto is one such example.

This thesis evaluates the performance of Presto in processing big RDF data against Apache Hive. A comparative analysis was also conducted against 4store, a native RDF store. To evaluate the performance of Presto for big RDF data processing, a map-reduce program and a compiler, based on Flex and Bison, were implemented. The map-reduce program loads RDF data into HDFS while the compiler translates SPARQL queries into a subset of SQL that Presto (and Hive) can understand. The evaluation was done on four- and eight-node Linux clusters installed on the Microsoft Windows Azure platform with RDF datasets of 10, 20, and 30 million triples. The results of the experiment show that Presto has much higher performance than Hive and can be used to process big RDF data. The thesis also proposes an architecture based on Presto, Presto-RDF, that can be used to process big RDF data.
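
The Flex/Bison compiler itself is not shown here; purely as an illustrative sketch of the translation idea (a single triples(subject, predicate, object) table and self-joins on shared variables are simplifying assumptions, not the thesis's actual schema), a SPARQL basic graph pattern can be rewritten into SQL of the kind Presto and Hive accept:

def bgp_to_sql(patterns):
    """Rewrite a SPARQL basic graph pattern as a SQL self-join.

    `patterns` is a list of (subject, predicate, object) strings; names
    beginning with '?' are variables, everything else is a constant.
    Each triple pattern becomes one alias of the `triples` table, and
    shared variables become join conditions.
    """
    select, where, seen = [], [], {}
    for i, (s, p, o) in enumerate(patterns):
        alias = f"t{i}"
        for column, term in (("subject", s), ("predicate", p), ("object", o)):
            if term.startswith("?"):
                if term in seen:
                    where.append(f"{seen[term]} = {alias}.{column}")
                else:
                    seen[term] = f"{alias}.{column}"
                    select.append(f"{alias}.{column} AS {term[1:]}")
            else:
                where.append(f"{alias}.{column} = '{term}'")
    froms = ", ".join(f"triples t{i}" for i in range(len(patterns)))
    return f"SELECT {', '.join(select)} FROM {froms} WHERE {' AND '.join(where)}"

# Two triple patterns sharing the variable ?person.
print(bgp_to_sql([("?person", "worksAt", "ASU"),
                  ("?person", "hasName", "?name")]))
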
Contributors: Mammo, Mulugeta (Author) / Bansal, Srividya (Thesis advisor) / Bansal, Ajay (Committee member) / Lindquist, Timothy (Committee member) / Arizona State University (Publisher)
Created: 2014
Description

The theme of this work is the development of fast numerical algorithms for sparse optimization and their applications in medical imaging and source localization using sensor array processing. Owing to the recently proposed theory of Compressive Sensing (CS), the $\ell_1$ minimization problem has attracted much attention for its ability to exploit sparsity. Traditional interior point methods encounter computational difficulties when solving CS applications. In the first part of this work, a fast algorithm based on the augmented Lagrangian method for solving the large-scale TV-$\ell_1$ regularized inverse problem is proposed. Specifically, by taking advantage of the separable structure, the original problem can be approximated via the sum of a series of simple functions with closed-form solutions. A preconditioner for solving the block Toeplitz with Toeplitz block (BTTB) linear system is proposed to accelerate the computation. An in-depth discussion of the rate of convergence and the optimal parameter selection criteria is given. Numerical experiments are used to test the performance and the robustness of the proposed algorithm over a wide range of parameter values. Applications of the algorithm in magnetic resonance (MR) imaging and a comparison with other existing methods are included. The second part of this work is the application of the TV-$\ell_1$ model to source localization using sensor arrays. The array output is reformulated into a sparse waveform via an over-complete basis, and the properties of the $\ell_p$-norm in detecting sparsity are studied. An algorithm is proposed for minimizing the resulting non-convex problem. According to the results of numerical experiments, the proposed algorithm, with the aid of the $\ell_p$-norm, can resolve closely distributed sources with higher accuracy than other existing methods.
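
The model itself is not reproduced in the abstract; as context, a common form of the TV-$\ell_1$ regularized inverse problem (assumed here, with $A$ the measurement operator, $\Psi$ a sparsifying transform, and $D_i$ the discrete gradient at pixel $i$; the exact weighting used in the thesis may differ) is

\[
\min_{u}\; \alpha \sum_i \|D_i u\|_2 \;+\; \beta \,\|\Psi u\|_1 \;+\; \frac{\mu}{2}\,\|A u - b\|_2^2 ,
\]

and the closed-form subproblem solutions mentioned above typically take the form of the soft-thresholding (shrinkage) operator

\[
\operatorname{shrink}(t,\tau) \;=\; \operatorname{sign}(t)\,\max\bigl(|t| - \tau,\; 0\bigr).
\]
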
Contributors: Shen, Wei (Author) / Mittlemann, Hans D (Thesis advisor) / Renaut, Rosemary A. (Committee member) / Jackiewicz, Zdzislaw (Committee member) / Gelb, Anne (Committee member) / Ringhofer, Christian (Committee member) / Arizona State University (Publisher)
Created: 2011
Description

Gathering and managing software requirements, known as Requirements Engineering (RE), is a significant and basic step in the Software Development Life Cycle (SDLC). Any error or defect introduced during the RE step will propagate to later steps of the SDLC, and resolving it will be more costly than resolving defects introduced in other steps. In order to produce better quality software, the requirements have to be free of defects. Verification and Validation (V&V) is performed on the Software Requirements Specification (SRS) document to improve the quality of the requirements. V&V of software requirements focused on a specific domain helps improve quality. A large database of software requirements from software projects of different domains was created. Software requirements from commercial applications are the focus of this project; other domains (embedded, mobile, e-commerce, etc.) can be the focus of future efforts. The V&V is done to inspect the requirements and improve their quality. Inspections are performed to detect defects in the requirements, and three approaches for inspecting software requirements are discussed: ad-hoc techniques, checklists, and scenario-based techniques. A more systematic, domain-specific technique is then presented for performing V&V of requirements.
Contributors: Chughtai, Rehman (Author) / Ghazarian, Arbi (Thesis advisor) / Bansal, Ajay (Committee member) / Millard, Bruce (Committee member) / Arizona State University (Publisher)
Created: 2012
Description

The study of deflagration-to-detonation transition (DDT) in explosives is of prime importance with regard to insensitive munitions (IM). Critical damage owing to thermal or shock stimuli could translate to significant loss of life and material. The present study models detonation and deflagration of a commonly used granular explosive: cyclotetramethylene-tetranitramine (HMX). A robust literature review is followed by computational modeling of gas gun and DDT tube test data using CTH, the Sandia National Laboratories three-dimensional multi-material Eulerian hydrocode. This dissertation proposes new computational practices and models that aid in predicting IM response to shock stimulus. CTH was first used to model experimental data sets of DDT tubes from the Naval Surface Weapons Center and Los Alamos National Laboratory, which were initiated by pyrogenic material and a piston, respectively. Analytical verification was performed, where possible, for detonation via empirically based equations at the Chapman-Jouguet state with errors below 2.1%, and for deflagration via pressure-dependent burn rate equations. CTH simulations include inert, history-variable reactive burn and Arrhenius models. The results are in excellent agreement with published HMX detonation velocities. Novel additions include accurate simulation of the pyrogenic material BKNO3 and the inclusion of porosity in energetic materials. The treatment of compaction is especially important in modeling precursory hotspots, caused by hydrodynamic collapse of void regions or grain interactions, prior to DDT of granular explosives. The CTH compaction model of HMX was verified to within 11% error via a five-pronged validation approach using gas gun data and employed a newly generated set of P-α parameters for granular HMX in a Mie-Gruneisen equation of state. Next, the compaction additions were extended to a volumetric surface burning model of HMX, which compares well with a set of empirical burn rates. Lastly, the compendium of detonation and deflagration models was applied to the aforementioned DDT tubes and demonstrates the working functionality of all models, albeit at the expense of significant computational resources. A robust hydrocode methodology is proposed that makes use of the deflagration, compaction, and detonation models as a means to predict the IM response of granular explosive materials to shock stimulus.
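
For context, the Mie-Gruneisen equation of state named above is commonly written with a shock-Hugoniot reference curve; the expressions below are standard textbook forms (with $\rho_0$, $c_0$, $s$, and $\Gamma$ the reference density, bulk sound speed, linear $U_s$-$u_p$ slope, and Gruneisen parameter), given as an assumed illustration rather than the exact CTH parameterization:

\[
p(\rho, e) \;=\; p_H(\rho) \;+\; \Gamma \rho \,\bigl[e - e_H(\rho)\bigr],
\qquad
p_H(\rho) \;=\; \frac{\rho_0\, c_0^2\, \mu\,(1+\mu)}{\bigl[1-(s-1)\mu\bigr]^2},
\qquad
\mu = \frac{\rho}{\rho_0} - 1,
\]

where $p_H$ and $e_H$ are the pressure and specific internal energy on the Hugoniot. A P-$\alpha$ compaction model adds the distension $\alpha = \rho_s/\rho$ (solid density over porous density), which relaxes from $\alpha_0 > 1$ toward 1 as pores collapse under pressure, which is why compaction matters for the precursory hotspots discussed above.
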
Contributors: Mahon, Kelly Susan (Author) / Lee, Taewoo (Thesis advisor) / Herrmann, Marcus (Committee member) / Chen, Kangping (Committee member) / Jiao, Yang (Committee member) / Huang, Huei-Ping (Committee member) / Arizona State University (Publisher)
Created: 2015
Description

Materials with unprecedented properties are necessary to make dramatic changes in current and future aerospace platforms. Hybrid materials and composites are increasingly being used in aircraft and spacecraft frames; however, future platforms will require an optimal design of novel materials that enable operation in a variety of environments and produce known/predicted damage mechanisms. Nanocomposites and nanoengineered composites with carbon nanotubes (CNTs) have the potential to yield significant improvements in strength, stiffness, fracture toughness, flame retardancy, and resistance to corrosion. These materials have therefore generated tremendous scientific and technical interest over the past decade, and various architectures are being explored for application to lightweight airframe structures. However, the success of such materials with significantly improved performance metrics requires careful control of the parameters during synthesis and processing. Their implementation is also limited by the lack of complete understanding of the effects the nanoparticles impart on the bulk properties of composites. It is common for computational methods to be applied to explain phenomena measured or observed experimentally. Frequently, a given phenomenon or material property is only considered to be fully understood when the associated physics has been identified through accompanying calculations or simulations.

The computationally and experimentally integrated research presented in this dissertation provides improved understanding of the mechanical behavior and response, including damage and failure, in CNT nanocomposites, enhancing confidence in their applications. The computations at the atomistic level help in understanding the underlying mechanochemistry and allow a systematic investigation of complex CNT architectures and material performance across a wide range of parameters. Simulation of bond breakage phenomena and development of the interface to continuum-scale damage capture the effects of applied loading and damage precursors and provide insight into the safety of nanoengineered composites under service loads. The validated modeling methodology is expected to be a step toward computationally assisted design and certification of novel materials, thus accelerating the pace of their implementation in future applications.
Contributors: Subramanian, Nithya (Author) / Chattopadhyay, Aditi (Thesis advisor) / Dai, Lenore (Committee member) / Jiao, Yang (Committee member) / Liu, Yongming (Committee member) / Rajadas, John (Committee member) / Arizona State University (Publisher)
Created: 2018