This collection includes most of the ASU Theses and Dissertations from 2011 to the present. ASU Theses and Dissertations are available in downloadable PDF format; however, a small percentage of items are under embargo. Information about the dissertations/theses includes degree information, committee members, an abstract, and supporting data or media.

In addition to the electronic theses in the ASU Digital Repository, ASU Theses and Dissertations can also be found in the ASU Library Catalog.

Dissertations and Theses granted by Arizona State University are archived and made available through a joint effort of the ASU Graduate College and the ASU Libraries. For more information or questions about this collection, visit the Digital Repository ETD Library Guide or contact the ASU Graduate College at gradformat@asu.edu.

Description

The increasing popularity of Twitter renders improved trustworthiness and relevance assessment of tweets much more important for search. However, given the limitations on the size of tweets, it is hard to extract measures for ranking from the tweet's content alone. I propose a method of ranking tweets by generating a reputation score for each tweet that is based not just on content, but also on additional information from the Twitter ecosystem that consists of users, tweets, and the web pages that tweets link to. This information is obtained by modeling the Twitter ecosystem as a three-layer graph. The reputation score is used to power two novel methods of ranking tweets by propagating the reputation over an agreement graph based on tweets' content similarity. Additionally, I show how the agreement graph helps counter tweet spam. An evaluation of my method on 16 million tweets from the TREC 2011 Microblog Dataset shows that it doubles the precision over baseline Twitter Search and achieves higher precision than the current state-of-the-art method. I present a detailed internal empirical evaluation of RAProp in comparison to several alternative approaches I propose, as well as an external evaluation in comparison to the current state-of-the-art method.
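
To make the propagation step concrete, here is a minimal, hypothetical sketch of reputation propagation over an agreement graph built from content similarity. The similarity measure, threshold, damping factor, and function names are illustrative assumptions for exposition only, not the dissertation's actual RAProp implementation.

```python
# Sketch: propagate per-tweet reputation scores over an agreement graph.
# Similarity measure, threshold, and damping factor are assumed values.

def jaccard(a: set, b: set) -> float:
    """Content similarity between two token sets."""
    return len(a & b) / len(a | b) if a | b else 0.0

def build_agreement_graph(tweets, threshold=0.4):
    """Connect tweets whose content similarity exceeds a threshold."""
    tokens = [set(t.lower().split()) for t in tweets]
    edges = {i: [] for i in range(len(tweets))}
    for i in range(len(tweets)):
        for j in range(i + 1, len(tweets)):
            w = jaccard(tokens[i], tokens[j])
            if w >= threshold:
                edges[i].append((j, w))
                edges[j].append((i, w))
    return edges

def propagate_reputation(initial, edges, damping=0.85, iters=20):
    """Blend each tweet's base reputation with its neighbors' scores."""
    scores = list(initial)
    for _ in range(iters):
        new_scores = []
        for i, base in enumerate(initial):
            total_w = sum(w for _, w in edges[i])
            spread = (sum(scores[j] * w for j, w in edges[i]) / total_w
                      if total_w else 0.0)
            new_scores.append((1 - damping) * base + damping * spread)
        scores = new_scores
    return scores

tweets = ["earthquake hits city center",
          "city center earthquake reported",
          "buy cheap followers now"]
base = [0.6, 0.5, 0.1]                      # feature-based reputation (assumed)
scores = propagate_reputation(base, build_agreement_graph(tweets))
ranking = sorted(range(len(tweets)), key=scores.__getitem__, reverse=True)
```

In this toy example the spam-like tweet agrees with no others, so it receives no propagated reputation and ranks last, illustrating how an agreement graph can help counter tweet spam.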
ContributorsRavikumar, Srijith (Author) / Kambhampati, Subbarao (Thesis advisor) / Davulcu, Hasan (Committee member) / Liu, Huan (Committee member) / Arizona State University (Publisher)
Created2013
Description

Continuous Delivery, one of the youngest and most popular members of the agile model family, has recently become a popular concept and method in the software development industry. In contrast to traditional software development methods, in which requirements and solutions must be fixed before development starts, it promotes adaptive planning, evolutionary development and delivery, and encourages rapid and flexible response to change. However, several problems prevent Continuous Delivery from being introduced into the education world. Taking these barriers into consideration, we propose a new cloud-based Continuous Delivery software development system. This system is designed to cover the whole life cycle of software development according to Continuous Delivery concepts, in a virtualized environment on the Vlab platform.
ContributorsDeng, Yuli (Author) / Huang, Dijiang (Thesis advisor) / Davulcu, Hasan (Committee member) / Chen, Yinong (Committee member) / Arizona State University (Publisher)
Created2013
Description

The ribosome is a ribozyme and central to the biosynthesis of proteins in all organisms. It has a strong bias against non-alpha-L-amino acids, such as alpha-D-amino acids and beta-amino acids. Additionally, the ribosome is only able to incorporate one amino acid in response to one codon. It has been demonstrated that reengineering of the peptidyltransferase center (PTC) of the ribosome enabled the incorporation of both alpha-D-amino acids and beta-amino acids into full-length protein. Described in Chapter 2 are five modified ribosomes having modifications in the peptidyltransferase center in the 23S rRNA. These modified ribosomes successfully incorporated five different beta-amino acids (2.1 - 2.5) into E. coli dihydrofolate reductase (DHFR). The second project (Chapter 3) focused on the study of the modified ribosomes facilitating the incorporation of the dipeptide glycylphenylalanine (3.25) and fluorescent dipeptidomimetic 3.26 into DHFR. These ribosomes also had modifications in the peptidyltransferase center in the 23S rRNA of the 50S ribosomal subunit. The modified DHFRs having beta-amino acids 2.3 and 2.5, dipeptide glycylphenylalanine (3.25), and dipeptidomimetic 3.26 were successfully characterized by MALDI-MS analysis of the peptide fragments produced by "in-gel" trypsin digestion of the modified proteins. The fluorescence spectra of the dipeptidomimetic 3.26 and of modified DHFR having fluorescent dipeptidomimetic 3.26 were also measured. The type I and II DNA topoisomerases have been firmly established as effective molecular targets for many antitumor drugs. A "classical" topoisomerase I or II poison acts by misaligning the free hydroxyl group of the sugar moiety of DNA and preventing the reverse transesterification reaction that religates DNA. Only two classes of compounds, saintopin and topopyrones, have been reported as dual topoisomerase I and II poisons. Chapter 4 describes the synthesis and biological evaluation of topopyrones. Compound 4.10, employed at 20 µM, was as efficient as 0.5 µM camptothecin, a potent topoisomerase I poison, in stabilizing the covalent binary complex (~30%). When compared with a known topoisomerase II poison, etoposide (at 0.5 µM), topopyrone 4.10 produced similar levels of stabilized DNA-enzyme binary complex (~34%) at 5 µM concentration.
ContributorsMaini, Rumit (Author) / Hecht, Sidney M. (Thesis advisor) / Gould, Ian (Committee member) / Yan, Hao (Committee member) / Arizona State University (Publisher)
Created2013
Description

Photosynthesis, one of the most important processes in nature, has provided an energy basis for nearly all life on Earth, as well as the fossil fuels we use today to power modern society. This research aims to mimic the photosynthetic process of converting incident solar energy into chemical potential energy in the form of a fuel via systems capable of carrying out photo-induced electron transfer to drive the production of hydrogen from water. Herein is detailed progress in using photo-induced stepwise electron transfer to drive the oxidation of water and the reduction of protons to hydrogen. The design uses more blue-absorbing porphyrin dyes to generate the high-potential intermediates needed for oxidizing water and more red-absorbing phthalocyanine dyes to form the low-potential charge needed for the production of hydrogen. For investigating water oxidation at the photoanode, high-potential porphyrins such as bis-pyridyl porphyrins and pentafluorophenyl porphyrins have been synthesized, and experiments have aimed at the co-immobilization of these dyes with an IrO2-nH2O catalyst on TiO2. To drive the cathodic reaction of the water-splitting photoelectrochemical cell, silicon octabutoxy-phthalocyanines have been explored, as they offer good absorption in the red to near infrared, coupled with low-potential photo-excited states. Axially and peripherally substituted phthalocyanines bearing carboxylic acid anchoring groups for immobilization on semiconductors such as TiO2 have been investigated. Ultimately, this work should culminate in a photoelectrochemical cell capable of splitting water to oxygen and hydrogen with the only energy input from light. A series of perylene dyes bearing multiple semiconducting metal oxide anchoring groups have been synthesized and studied. Results have shown interfacial electron transfer between these perylenes and TiO2 nanoparticles, both nanoparticles encapsulated within reverse micelles and naked nanoparticles. The binding process was followed by monitoring the hypsochromic shift of the dye absorption spectra over time. Photoinduced electron transfer from the singlet excited state of the perylenes to the TiO2 conduction band is indicated by emission quenching of the TiO2-bound form of the dyes and confirmed by transient absorption measurements of the radical cation of the dyes and free carriers (injected electrons) in the TiO2.
ContributorsBergkamp, Jesse J (Author) / Moore, Ana L (Thesis advisor) / Mariño-Ochoa, Ernesto (Thesis advisor) / Gust, Devens J (Committee member) / Gould, Ian (Committee member) / Arizona State University (Publisher)
Created2013
Description

The biological and chemical diversity of protein structure and function can be greatly expanded by position-specific incorporation of non-natural amino acids bearing a variety of functional groups. Non-cognate amino acids can be incorporated into proteins at specific sites by using orthogonal aminoacyl-tRNA synthetase/tRNA pairs in conjunction with nonsense, rare, or 4-bp codons. There has been considerable progress in developing new types of amino acids, in identifying novel methods of tRNA aminoacylation, and in expanding the genetic code to direct their position. Chemical aminoacylation of tRNAs is accomplished by acylation and ligation of a dinucleotide (pdCpA) to the 3'-terminus of truncated tRNA. This strategy allows the incorporation of a wide range of natural and unnatural amino acids into pre-determined sites, thereby facilitating the study of structure-function relationships in proteins and allowing the investigation of their biological, biochemical and biophysical properties. Described in Chapter 1 is the current methodology for synthesizing aminoacylated suppressor tRNAs. Aminoacylated suppressor tRNACUAs are typically prepared by linking pre-aminoacylated dinucleotides (aminoacyl-pdCpAs) to 74 nucleotide (nt) truncated tRNAs (tRNA-COH) via a T4 RNA ligase mediated reaction. Alternatively, there is another route outlined in Chapter 1 that utilizes a different pre-aminoacylated dinucleotide, AppA. This dinucleotide has been shown to be a suitable substrate for T4 RNA ligase mediated coupling with abbreviated tRNA-COHs for production of 76 nt aminoacyl-tRNACUAs. The synthesized suppressor tRNAs have been shown to participate in protein synthesis in vitro, in an S30 (E. coli) coupled transcription-translation system in which there is a UAG codon in the mRNA at the position corresponding to Val10. Chapter 2 describes the synthesis of two non-proteinogenic amino acids, L-thiothreonine and L-allo-thiothreonine, and their incorporation into predetermined positions of a catalytically competent dihydrofolate reductase (DHFR) analogue lacking cysteine. Here, the elaborated proteins were site-specifically derivatized with a fluorophore at the thiothreonine residue. The synthesis and incorporation of phosphorotyrosine derivatives into DHFR is illustrated in Chapter 3. Three different phosphorylated tyrosine derivatives were prepared: bis-nitrobenzylphosphoro-L-tyrosine, nitrobenzylphosphoro-L-tyrosine, and phosphoro-L-tyrosine. Their ability to participate in a protein synthesis system was also evaluated.
ContributorsNangreave, Ryan Christopher (Author) / Hecht, Sidney M. (Thesis advisor) / Yan, Hao (Committee member) / Gould, Ian (Committee member) / Arizona State University (Publisher)
Created2013
Description

This dissertation presents the Temporal Event Query Language (TEQL), a new language for querying event streams. Event Stream Processing enables online querying of streams of events to extract relevant data in a timely manner. TEQL enables querying of interval-based event streams using temporal database operators. Temporal databases and temporal query languages have been a subject of research for more than 30 years and are a natural fit for expressing queries that involve a temporal dimension. However, operators developed in this context cannot be directly applied to event streams. The research extends a preexisting relational framework for event stream processing to support temporal queries. The language features and formal semantic extensions required to extend the relational framework are identified. The extended framework supports continuous, step-wise evaluation of temporal queries. The incremental evaluation of TEQL operators is formalized to avoid re-computation of previous results. The research includes the development of a prototype that supports the integrated event and temporal query processing framework, with support for incremental evaluation and materialization of intermediate results. TEQL enables reporting temporal data in the output, direct specification of conditions over timestamps, and specification of temporal relational operators. Through the integration of temporal database operators with event languages, a new class of temporal queries is made possible for querying event streams. New features include semantic aggregation, extraction of temporal patterns using set operators, and a more accurate specification of event co-occurrence.
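
As a rough illustration of step-wise, incremental evaluation over interval-based event streams, the sketch below joins two streams on interval overlap while reusing previously stored events instead of recomputing earlier results. The class and field names are hypothetical and do not reflect TEQL's actual syntax or operator semantics.

```python
# Sketch: incremental "overlaps" join over two interval-based event streams.
# Stored state lets each new arrival be joined only against past events.
from dataclasses import dataclass

@dataclass
class Event:
    name: str
    start: int   # interval start timestamp
    end: int     # interval end timestamp

class IncrementalOverlapJoin:
    def __init__(self):
        self.left: list[Event] = []
        self.right: list[Event] = []

    @staticmethod
    def _overlaps(a: Event, b: Event) -> bool:
        return a.start < b.end and b.start < a.end

    def push_left(self, e: Event):
        """New left-stream event: emit only the pairs it creates."""
        self.left.append(e)
        return [(e, r) for r in self.right if self._overlaps(e, r)]

    def push_right(self, e: Event):
        """New right-stream event: emit only the pairs it creates."""
        self.right.append(e)
        return [(lft, e) for lft in self.left if self._overlaps(lft, e)]

join = IncrementalOverlapJoin()
join.push_left(Event("sensorA", 0, 10))
print(join.push_right(Event("sensorB", 5, 15)))   # one overlapping pair
```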
ContributorsShiva, Foruhar Ali (Author) / Urban, Susan D (Thesis advisor) / Chen, Yi (Thesis advisor) / Davulcu, Hasan (Committee member) / Sarjoughian, Hessam S. (Committee member) / Arizona State University (Publisher)
Created2012
Description

As the genetic information storage vehicle, deoxyribonucleic acid (DNA) molecules are essential to all known living organisms and many viruses. It is amazing that such a large amount of information about how life develops can be stored in these tiny molecules. Countless scientists, especially some biologists, are trying to decipher the genetic information stored in these captivating molecules. Meanwhile, another group of researchers, nanotechnologists in particular, have discovered that the unique and concise structural features of DNA together with its information coding ability can be utilized for nano-construction efforts. This idea culminated in the birth of the field of DNA nanotechnology, which is the main topic of this dissertation. The ability of rationally designed DNA strands to self-assemble into arbitrary nanostructures without external direction is the basis of this field. A series of novel design principles for DNA nanotechnology are presented here, from topological DNA nanostructures to complex and curved DNA nanostructures, from pure DNA nanostructures to hybrid RNA/DNA nanostructures. As one of the most important and pioneering fields in controlling the assembly of materials (both DNA and other materials) at the nanoscale, DNA nanotechnology is developing at a dramatic speed, and as more and more construction approaches are invented, exciting advances will emerge in ways that we may or may not predict.
ContributorsHan, Dongran (Author) / Yan, Hao (Thesis advisor) / Liu, Yan (Thesis advisor) / Ros, Alexandra (Committee member) / Gould, Ian (Committee member) / Arizona State University (Publisher)
Created2012
Description

The pay-as-you-go economic model of cloud computing increases the visibility, traceability, and verifiability of software costs. Application developers must understand how their software uses resources when running in the cloud in order to stay within budgeted costs and/or produce expected profits. Cloud computing's unique economic model also leads naturally to an earn-as-you-go profit model for many cloud based applications. These applications can benefit from low level analyses for cost optimization and verification. Testing cloud applications to ensure they meet monetary cost objectives has not been well explored in the current literature. When considering revenues and costs for cloud applications, the resource economic model can be scaled down to the transaction level in order to associate source code with costs incurred while running in the cloud. Both static and dynamic analysis techniques can be developed and applied to understand how and where cloud applications incur costs. Such analyses can help optimize (i.e. minimize) costs and verify that they stay within expected tolerances. An adaptation of Worst Case Execution Time (WCET) analysis is presented here to statically determine worst case monetary costs of cloud applications. This analysis is used to produce an algorithm for determining control flow paths within an application that can exceed a given cost threshold. The corresponding results are used to identify path sections that contribute most to cost excess. A hybrid approach for determining cost excesses is also presented that is comprised mostly of dynamic measurements but that also incorporates calculations that are based on the static analysis approach. This approach uses operational profiles to increase the precision and usefulness of the calculations.
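
A minimal sketch of the WCET-style idea adapted to monetary cost appears below: it enumerates acyclic control-flow paths, sums an assumed per-node cost, and flags paths whose total exceeds a given threshold. The graph encoding and cost model are assumptions for illustration, not the dissertation's actual analysis.

```python
# Sketch: flag control-flow paths whose worst-case monetary cost exceeds a threshold.
# cfg maps node -> successors; costs maps node -> cost of one execution (assumed model).

def paths_exceeding(cfg, costs, entry, exit_node, threshold):
    flagged = []

    def walk(node, path, total):
        path = path + [node]
        total += costs[node]
        if node == exit_node:
            if total > threshold:
                flagged.append((path, total))
            return
        for succ in cfg.get(node, []):
            if succ not in path:          # ignore cycles in this simple sketch
                walk(succ, path, total)

    walk(entry, [], 0.0)
    return flagged

# Toy example: the branch through "b" is the cost-excess path.
cfg = {"entry": ["a", "b"], "a": ["exit"], "b": ["exit"], "exit": []}
costs = {"entry": 0.01, "a": 0.02, "b": 0.50, "exit": 0.0}
print(paths_exceeding(cfg, costs, "entry", "exit", threshold=0.25))
```

The flagged paths and their totals point to the path sections that contribute most to the cost excess.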
ContributorsBuell, Kevin, Ph.D (Author) / Collofello, James (Thesis advisor) / Davulcu, Hasan (Committee member) / Lindquist, Timothy (Committee member) / Sen, Arunabha (Committee member) / Arizona State University (Publisher)
Created2012
Description

A semiconductor supply chain modeling and simulation platform using Linear Program (LP) optimization and parallel Discrete Event System Specification (DEVS) process models has been developed in a joint effort by ASU and Intel Corporation. A Knowledge Interchange Broker (KIBDEVS/LP) was developed to broker information synchronously between the DEVS and LP models. Recently a single-echelon heuristic Inventory Strategy Module (ISM) was added to correct for forecast bias in customer demand data using different smoothing techniques. The optimization model could then use information provided by the forecast model to make better decisions for the process model. The composition of ISM with the LP and DEVS models resulted in the first realization of what is now called the Optimization Simulation Forecast (OSF) platform. It could handle a single-echelon supply chain system consisting of single hubs and single products. In this thesis, this single-echelon simulation platform is extended to handle multiple echelons with multiple inventory elements handling multiple products. The main task for the multi-echelon OSF platform was to extend the KIBDEVS/LP such that ISM interactions with the LP and DEVS models could also be supported. To achieve this, a new, scalable XML schema for the KIB has been developed. The XML schema has also resulted in strengthening the KIB execution engine design. A sequential scheme controls the executions of the DEVS-Suite simulator, CPLEX optimizer, and ISM engine. To use the ISM for multiple echelons, it is extended to compute forecast customer demands and safety stocks over multiple hubs and products. Basic examples for semiconductor manufacturing spanning single- and two-echelon supply chain systems have been developed and analyzed. Experiments using perfect data were conducted to show the correctness of the OSF platform design and implementation. Simple but realistic experiments have also been conducted. They highlight the kinds of supply chain dynamics that can be evaluated using discrete event process simulation, linear programming optimization, and heuristic forecasting models.
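
For intuition about the forecasting side, here is a small sketch of computing a demand forecast and safety stock per (hub, product) pair with simple exponential smoothing. The smoothing constant, service-level factor, and function names are textbook-style assumptions, not the ISM's actual heuristics.

```python
# Sketch: per-(hub, product) demand forecast via exponential smoothing,
# with a safety stock derived from one-step-ahead forecast errors.
from statistics import pstdev

def smooth_with_errors(demands, alpha=0.3):
    forecast = demands[0]
    errors = []
    for d in demands[1:]:
        errors.append(d - forecast)                    # one-step-ahead error
        forecast = alpha * d + (1 - alpha) * forecast
    return forecast, errors

def plan(demand_history, z=1.65):
    """z ~ 1.65 corresponds to roughly a 95% service level (assumed)."""
    result = {}
    for key, series in demand_history.items():
        forecast, errors = smooth_with_errors(series)
        result[key] = {"forecast": forecast,
                       "safety_stock": z * pstdev(errors)}
    return result

history = {("hub1", "productA"): [100, 120, 90, 110],
           ("hub2", "productA"): [40, 55, 60, 45]}
print(plan(history))
```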
ContributorsSmith, James Melkon (Author) / Sarjoughian, Hessam S. (Thesis advisor) / Davulcu, Hasan (Committee member) / Fainekos, Georgios (Committee member) / Arizona State University (Publisher)
Created2012
Description

Process migration is a heavily studied research area and has a number of applications in distributed systems. Process migration means transferring a process running on one machine to another such that it resumes execution from the point at which it was suspended. The conventional approach to implementing process migration is to move the entire state information of the process (including hardware context, virtual memory, files, etc.) from one machine to another. Copying all the state information is costly. This thesis proposes and demonstrates a new approach to migrating a process between two cores of the Intel Single Chip Cloud (SCC), an experimental 48-core processor by Intel, with each core running a separate instance of the operating system. In this method, the amount of process state to be transferred from one core's memory to another is reduced by making use of special registers called lookup tables (LUTs) present on each core of the SCC. Thus, this new approach is faster than the conventional method.
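
The core idea can be sketched as follows: instead of copying the migrating process's memory, only the source core's lookup-table entries mapping that process's address range to off-chip system memory are installed in the destination core's table. The data structures and field names below are purely illustrative, not the SCC's actual register layout or the thesis implementation.

```python
# Sketch: migrate a process by remapping LUT slots rather than copying memory.
from dataclasses import dataclass, field

@dataclass
class LUTEntry:
    slot: int          # index in the core's lookup table
    phys_base: int     # base address in off-chip system memory

@dataclass
class Core:
    core_id: int
    lut: dict = field(default_factory=dict)   # slot -> LUTEntry

def migrate(process_slots, src: Core, dst: Core):
    """Move a process's address-space mappings from src to dst."""
    for slot in process_slots:
        entry = src.lut.pop(slot)   # release the mapping on the source core
        dst.lut[slot] = entry       # same physical memory, now visible on dst
    # Only a small register/context copy remains; the bulk of the address
    # space is never transferred, which is why this approach is faster.

src = Core(0, {4: LUTEntry(slot=4, phys_base=0x80000000)})
dst = Core(1)
migrate([4], src, dst)
print(dst.lut)
```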
ContributorsJain, Vaibhav (Author) / Dasgupta, Partha (Thesis advisor) / Shrivastava, Aviral (Committee member) / Davulcu, Hasan (Committee member) / Arizona State University (Publisher)
Created2013