This collection includes most of the ASU Theses and Dissertations from 2011 to the present. ASU Theses and Dissertations are available in downloadable PDF format; however, a small percentage of items are under embargo. Information about the dissertations/theses includes degree information, committee members, an abstract, and supporting data or media.

In addition to the electronic theses in the ASU Digital Repository, ASU Theses and Dissertations can also be found in the ASU Library Catalog.

Dissertations and Theses granted by Arizona State University are archived and made available through a joint effort of the ASU Graduate College and the ASU Libraries. For more information or questions about this collection, visit the Digital Repository ETD Library Guide or contact the ASU Graduate College at gradformat@asu.edu.

Description
This dissertation explores the use of bench-scale batch microcosms in the remedial design of contaminated aquifers, presents an alternative methodology for conducting such treatability studies, and examines, from technical, economic, and social perspectives, the real-world application of this new technology. In situ bioremediation (ISB) is an effective remedial approach for many contaminated groundwater sites. However, site-specific variability necessitates small-scale treatability studies prior to full-scale implementation. The most common methodology is the batch microcosm, whose potential limitations and suitable technical alternatives are explored in this thesis. In a critical literature review, I discuss how continuous-flow conditions stimulate microbial attachment and biofilm formation, and identify unique microbiological phenomena largely absent in batch bottles, yet potentially relevant to contaminant fate. Following up on this theoretical evaluation, I experimentally produce pyrosequencing data and perform beta diversity analysis to demonstrate that batch and continuous-flow (column) microcosms foster distinctly different microbial communities. Next, I introduce the In Situ Microcosm Array (ISMA), which took approximately two years to design, develop, build, and iteratively improve. The ISMA can be deployed down-hole in groundwater monitoring wells of contaminated aquifers to autonomously conduct multiple parallel continuous-flow treatability experiments. The ISMA stores all samples generated in the course of each experiment, thereby preventing the release of chemicals into the environment. Detailed results are presented from an ISMA demonstration evaluating ISB for the treatment of hexavalent chromium and trichloroethene. In a technical and economic comparison to batch microcosms, I demonstrate that the ISMA is both effective in informing remedial design decisions and cost-competitive. Finally, I report on a participatory technology assessment (pTA) workshop, attended by diverse stakeholders of the Phoenix 52nd Street Superfund Site, evaluating the ISMA's ability to address a real-world problem. In addition to receiving valuable feedback on perceived ISMA limitations, I conclude from the workshop that pTA can facilitate mutual learning even among entrenched stakeholders. In summary, my doctoral research (i) pinpointed limitations of current remedial design approaches, (ii) produced a novel alternative approach, and (iii) demonstrated the technical, economic, and social value of this novel remedial design tool, i.e., the In Situ Microcosm Array technology.
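As a rough illustration of the beta diversity comparison described above, the following is a minimal sketch: pairwise Bray-Curtis dissimilarities are computed between microbial community profiles, and within-group distances are contrasted with between-group distances. The OTU abundance vectors are hypothetical placeholders, not data from the study.

```python
# Hypothetical sketch: Bray-Curtis beta diversity between batch and column
# microbial community profiles (OTU abundance vectors are invented examples).
import numpy as np
from scipy.spatial.distance import braycurtis

# Rows = replicate microcosms, columns = OTU counts (placeholder values).
batch = np.array([[120, 30, 5, 0], [110, 40, 8, 2], [130, 25, 4, 1]])
column = np.array([[20, 90, 60, 35], [15, 100, 55, 40], [25, 85, 70, 30]])

def mean_pairwise(a, b):
    """Mean Bray-Curtis dissimilarity over all pairs drawn from a and b."""
    return np.mean([braycurtis(x, y) for x in a for y in b])

within_batch = np.mean([braycurtis(batch[i], batch[j])
                        for i in range(3) for j in range(i + 1, 3)])
between = mean_pairwise(batch, column)
print(f"within-batch dissimilarity: {within_batch:.2f}")
print(f"batch-vs-column dissimilarity: {between:.2f}")  # larger => distinct communities
```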
Contributors: Kalinowski, Tomasz (Author) / Halden, Rolf U. (Thesis advisor) / Johnson, Paul C (Committee member) / Krajmalnik-Brown, Rosa (Committee member) / Bennett, Ira (Committee member) / Arizona State University (Publisher)
Created: 2013
Description
The increasing popularity of Twitter renders improved trustworthiness and relevance assessment of tweets much more important for search. However, given the limit on tweet length, it is hard to extract ranking measures from a tweet's content alone. I propose a method of ranking tweets by generating a reputation score for each tweet that is based not just on content, but also on additional information from the Twitter ecosystem, which consists of users, tweets, and the web pages that tweets link to. This information is obtained by modeling the Twitter ecosystem as a three-layer graph. The reputation score is used to power two novel methods of ranking tweets by propagating the reputation over an agreement graph based on tweets' content similarity. Additionally, I show how the agreement graph helps counter tweet spam. An evaluation of my method on 16 million tweets from the TREC 2011 Microblog Dataset shows that it doubles the precision over baseline Twitter Search and achieves higher precision than the current state-of-the-art method. I present a detailed internal empirical evaluation of RAProp in comparison to several alternative approaches I propose, as well as an external evaluation in comparison to the current state-of-the-art method.
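To make the propagation idea concrete, here is a minimal sketch: seed reputation scores are smoothed over an agreement graph whose edges encode content similarity between tweets. The similarity matrix, damping factor, and scores are hypothetical and are not the actual RAProp formulation.

```python
# Hypothetical sketch: propagating seed reputation scores over an agreement
# graph (edge weights = content similarity between tweets). Not the exact
# RAProp algorithm; all values are invented for illustration.
import numpy as np

agreement = np.array([      # symmetric similarity between 4 tweets
    [0.0, 0.8, 0.1, 0.0],
    [0.8, 0.0, 0.2, 0.0],
    [0.1, 0.2, 0.0, 0.3],
    [0.0, 0.0, 0.3, 0.0],
])
seed = np.array([0.9, 0.2, 0.5, 0.1])    # ecosystem-based reputation per tweet

# Row-normalize so each tweet distributes its agreement mass.
row_sums = agreement.sum(axis=1, keepdims=True)
P = agreement / np.where(row_sums == 0, 1, row_sums)

alpha = 0.5                              # weight kept on the original seed score
scores = seed.copy()
for _ in range(20):                      # iterate toward a fixed point
    scores = alpha * seed + (1 - alpha) * P @ scores

print(np.argsort(-scores))               # ranking: highest reputation first
```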
Contributors: Ravikumar, Srijith (Author) / Kambhampati, Subbarao (Thesis advisor) / Davulcu, Hasan (Committee member) / Liu, Huan (Committee member) / Arizona State University (Publisher)
Created: 2013
Description
Automating aspects of biocuration through biomedical information extraction could significantly impact biomedical research by enabling greater biocuration throughput and improving the feasibility of a wider scope. An important step in biomedical information extraction systems is named entity recognition (NER), where mentions of entities such as proteins and diseases are located within natural-language text and their semantic type is determined. This step is critical for later tasks in an information extraction pipeline, including normalization and relationship extraction. BANNER is a benchmark biomedical NER system using linear-chain conditional random fields and the rich feature set approach. A case study with BANNER locating genes and proteins in biomedical literature is described. The first corpus for disease NER adequate for use as training data is introduced and employed in a case study of disease NER. The first corpus locating adverse drug reactions (ADRs) in user posts to a health-related social website is also described, and a system to locate and identify ADRs in social media text is created and evaluated. The rich feature set approach to creating NER feature sets is argued to be subject to diminishing returns, implying that additional improvements may require more sophisticated methods for creating the feature set. This motivates the first application of multivariate feature selection with filters and false discovery rate analysis to biomedical NER, resulting in a feature set at least three orders of magnitude smaller than the set created by the rich feature set approach. Finally, two novel approaches to NER by modeling the semantics of token sequences are introduced. The first method focuses on sequence content, using language models to determine whether a sequence more closely resembles entries in a lexicon of entity names or text from an unlabeled corpus. The second method models the distributional semantics of token sequences, determining the similarity between a potential mention and the token sequences from the training data by analyzing the contexts where each sequence appears in a large unlabeled corpus. The second method is shown to improve the performance of BANNER on multiple data sets.
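The feature-selection step described above can be illustrated with a minimal sketch using scikit-learn's false-discovery-rate filter on a token-level feature matrix. The feature matrix and labels below are synthetic stand-ins, not the BANNER feature set.

```python
# Hypothetical sketch: filter-based feature selection with false discovery
# rate control, in the spirit of shrinking a rich NER feature set.
# Data is synthetic; BANNER's real features are token-level and sparse.
import numpy as np
from sklearn.feature_selection import SelectFdr, chi2

rng = np.random.default_rng(0)
n_tokens, n_features = 1000, 500
X = rng.integers(0, 2, size=(n_tokens, n_features))  # binary feature matrix
y = rng.integers(0, 2, size=n_tokens)                # 1 = token inside a mention

# Make a handful of features genuinely predictive so the filter retains them.
for j in range(5):
    noise = rng.random(n_tokens) < 0.1
    X[:, j] = np.where(noise, 1 - y, y)

selector = SelectFdr(score_func=chi2, alpha=0.05)    # control FDR at 5%
X_small = selector.fit_transform(X, y)
print(X.shape, "->", X_small.shape)  # most uninformative features are dropped
```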
Contributors: Leaman, James Robert (Author) / Gonzalez, Graciela (Thesis advisor) / Baral, Chitta (Thesis advisor) / Cohen, Kevin B (Committee member) / Liu, Huan (Committee member) / Ye, Jieping (Committee member) / Arizona State University (Publisher)
Created: 2013
Description
This work focuses on a generalized assessment of source zone natural attenuation (SZNA) at chlorinated aliphatic hydrocarbon (CAH) impacted sites. Given the number of such sites and the technical challenges of cleanup, there is a need for an SZNA assessment method at CAH-impacted sites. The method anticipates that decision makers will be interested in the following questions: (1) Is SZNA occurring, and what processes contribute? (2) What are the current SZNA rates? (3) What are the longer-term implications? The approach is macroscopic and uses multiple lines of evidence. An in-depth application of the generalized, non-site-specific method over multiple site events, with sampling refinement approaches applied to improve SZNA estimates, is presented for three CAH-impacted sites, with a focus on mass discharge rates for four events over approximately three years (Site 1: 2.9, 8.4, 4.9, 2.8 kg/yr as PCE; Site 2: 1.6, 2.2, 1.7, 1.1 kg/yr as PCE; Site 3: 570, 590, 250, 240 kg/yr as TCE). When applying the generalized CAH-SZNA method, different practitioners will likely not sample a site in the same way, especially regarding sampling density on a groundwater transect. Calculation of SZNA rates is affected by contaminant spatial variability relative to transect sampling intervals and density, and variations in either result in different mass discharge estimates. The effects of varied sampling densities and spacings on discharge estimates were examined to develop heuristic sampling guidelines with practical site sampling densities; the guidelines aim to reduce the variability in discharge estimates due to different sampling approaches and to improve confidence in SZNA rates, allowing decision makers to place the rates in perspective and determine a course of action based on remedial goals. Finally, bench-scale testing was used to address longer-term questions, specifically the nature and extent of source architecture. A rapid in situ disturbance method was developed using a bench-scale apparatus. The approach allows rapid identification of the presence of DNAPL using several common pilot-scale technologies (ISCO, air sparging, water injection) and can identify relevant source architectural features (ganglia, pools, dissolved source). Understanding of source architecture and identification of DNAPL-containing regions greatly enhances site conceptual models, improving estimated time frames for SZNA and potentially improving the design of remedial systems.
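Mass discharge across a transect is conceptually a summation of concentration times groundwater flux over the sampled cross-sectional area; a minimal sketch follows. The cell geometry, Darcy flux, and concentrations are hypothetical values, not data from the three sites.

```python
# Hypothetical sketch: contaminant mass discharge (Md) across a groundwater
# transect, Md = sum(C_i * q_i * A_i) over sampled cells. Values are invented.
import numpy as np

C = np.array([120e-6, 450e-6, 80e-6, 10e-6])  # concentration, kg/m^3 (e.g., TCE)
q = np.array([0.05, 0.04, 0.06, 0.05])        # Darcy flux, m/day
A = np.array([4.0, 4.0, 4.0, 4.0])            # cell area on the transect, m^2

md_kg_per_day = np.sum(C * q * A)             # coarser sampling -> fewer cells,
md_kg_per_year = md_kg_per_day * 365.0        # potentially biased estimates
print(f"mass discharge: {md_kg_per_year:.3f} kg/yr")
```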
Contributors: Ekre, Ryan (Author) / Johnson, Paul Carr (Thesis advisor) / Rittmann, Bruce (Committee member) / Krajmalnik-Brown, Rosa (Committee member) / Arizona State University (Publisher)
Created: 2013
Description
To further efforts to produce energy from renewable sources, microbial electrochemical cells (MXCs) can utilize anode-respiring bacteria (ARB) to couple the oxidation of an organic substrate to the delivery of electrons to the anode. Although ARB such as Geobacter and Shewanella have been well studied in terms of their microbiology and electrochemistry, much is still unknown about the mechanism of electron transfer to the anode. To this end, this thesis seeks to elucidate the complexities of electron transfer in Geobacter sulfurreducens biofilms by employing electrochemical impedance spectroscopy (EIS) as the tool of choice. Experiments measuring EIS resistances as a function of growth were used to uncover the potential gradients that emerge in biofilms as they grow and become thicker. Alongside this work on a model ARB, electrochemical characterization of a halophile, Geoalkalibacter subterraneus (Glk. subterraneus), revealed that this organism can function as an ARB and produce seemingly high current densities while consuming different organic substrates, including acetate, butyrate, and glycerol. The importance of identifying and studying novel ARB for broader MXC applications is stressed in this thesis as a potential avenue for tackling some of humanity's energy problems.
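EIS studies of this kind typically interpret the measured spectrum through an equivalent circuit. The following minimal sketch computes the impedance of a simplified Randles circuit (solution resistance in series with a parallel charge-transfer resistance and double-layer capacitance); the parameter values are hypothetical, not fitted biofilm data, and the thesis's actual circuit model may differ.

```python
# Hypothetical sketch: impedance spectrum of a simplified Randles circuit,
# Z(w) = Rs + Rct / (1 + j*w*Rct*Cdl). Parameters are illustrative only.
import numpy as np

Rs = 10.0       # solution (ohmic) resistance, ohm
Rct = 200.0     # charge-transfer resistance, ohm
Cdl = 1e-3      # double-layer capacitance, F

freq = np.logspace(-2, 4, 50)            # 0.01 Hz to 10 kHz
w = 2 * np.pi * freq
Z = Rs + Rct / (1 + 1j * w * Rct * Cdl)  # complex impedance

# Nyquist-plot coordinates: Re(Z) vs -Im(Z); the semicircle diameter ~ Rct.
for f, z in zip(freq[::10], Z[::10]):
    print(f"{f:8.2f} Hz  Re={z.real:7.1f}  -Im={-z.imag:7.1f}")
```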
Contributors: Ajulo, Oluyomi (Author) / Torres, Cesar (Thesis advisor) / Nielsen, David (Committee member) / Krajmalnik-Brown, Rosa (Committee member) / Popat, Sudeep (Committee member) / Arizona State University (Publisher)
Created: 2013
Description
In situ remediation of contaminated aquifers, specifically in situ bioremediation (ISB), has gained popularity over pump-and-treat operations. It represents a more sustainable approach that can also achieve complete mineralization of contaminants in the subsurface. However, the subsurface reality is very complex, characterized by hydrodynamic groundwater movement, geological heterogeneity, and mass-transfer phenomena governing contaminant transport and bioavailability. These phenomena cannot be properly studied using commonly conducted laboratory batch microcosms, which lack a realistic representation of the processes named above. Instead, the relevant processes are better understood using flow-through systems (sediment columns). However, flow-through column studies are typically conducted without replicates. Due to additional sources of variability (e.g., flow rate variation between columns and over time), column studies are expected to be less reproducible than simple batch microcosms. This expectation was assessed through a comprehensive statistical analysis of results from multiple batch and column studies, using anaerobic microbial biotransformations of trichloroethene and of perchlorate as case studies. The results revealed no statistically significant differences between the reproducibility of batch and column studies. It has further been recognized that laboratory studies cannot accurately reproduce many phenomena encountered in the field. To overcome this limitation, a down-hole diagnostic device (the in situ microcosm array, ISMA) was developed that enables the autonomous operation of replicate flow-through sediment columns in a realistic aquifer setting. Computer-aided design (CAD), rapid prototyping, and computer numerical control (CNC) machining were used to create a tubular device enabling practitioners to conduct conventional sediment column studies in situ. A case study is presented in which two remediation strategies, monitored natural attenuation and bioaugmentation with concomitant biostimulation, were evaluated in the laboratory and in situ at a perchlorate-contaminated site. The findings demonstrate the feasibility of evaluating anaerobic bioremediation in a moderately aerobic aquifer. They further highlight the possibility of mimicking in situ remediation strategies at small scale in situ. The ISMA is the first device offering autonomous in situ operation of conventional flow-through sediment microcosms and producing statistically significant data through the use of multiple replicates. With its sustainable approach to treatability testing and data gathering, the ISMA represents a versatile addition to the toolbox of scientists and engineers.
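One plausible way to frame the batch-versus-column reproducibility comparison is as a test of equal variance across replicate measurements; a minimal sketch using Levene's test follows. The replicate degradation rates are invented numbers, and the thesis's actual statistical analysis may have used different tests.

```python
# Hypothetical sketch: comparing reproducibility (replicate-to-replicate
# variability) of batch vs. column studies with Levene's test. Invented data.
from scipy.stats import levene

# Replicate contaminant degradation rates (e.g., umol/L/day), placeholders.
batch_rates  = [12.1, 11.8, 12.6, 11.5, 12.3]
column_rates = [10.9, 12.8, 11.4, 12.2, 10.7]

stat, p = levene(batch_rates, column_rates)
print(f"Levene W = {stat:.2f}, p = {p:.3f}")
if p > 0.05:
    print("No significant difference in variability between batch and column.")
else:
    print("Variability differs significantly between the two study types.")
```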
Contributors: McClellan, Kristin (Author) / Halden, Rolf U. (Thesis advisor) / Johnson, Paul C (Committee member) / Krajmalnik-Brown, Rosa (Committee member) / Arizona State University (Publisher)
Created: 2013
Description
Data mining is increasingly important in solving a variety of industry problems. Our initiative involves estimating resource requirements by skill set for future projects by mining and analyzing actual resource consumption data from past projects in the semiconductor industry. In achieving this goal we face difficulties such as consumption data stored in inconsistent formats and insufficient data about project attributes with which to interpret the consumption data. Our first goal is to clean the historical data and organize it into meaningful structures for analysis. Once the data preprocessing is complete, data mining techniques such as clustering are applied to find projects that involve resources with similar skill sets and that have similar complexity and size. This results in "resource utilization templates" for groups of related projects from a resource consumption perspective. Project characteristics that generate this diversity in headcounts and skill sets are then identified. These characteristics are not currently contained in the database and are elicited from the managers of historical projects, representing an opportunity to improve the usefulness of the data collection system in the future. The ultimate goal is to match product technical features with the resource requirements of past projects as a model to forecast resource requirements by skill set for future projects. The forecasting model is developed using linear regression with cross-validation of the training data, as past project executions are relatively few in number. Acceptable levels of forecast accuracy are achieved relative to human experts' results, and the tool is applied to forecast the resource demand of some future projects.
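The pipeline sketched in the abstract — cluster past projects, then regress resource demand on project features with cross-validation — can be illustrated as follows. The feature names and numbers are hypothetical stand-ins for the proprietary project data.

```python
# Hypothetical sketch: cluster past projects into utilization templates, then
# forecast headcount by skill set with cross-validated linear regression.
# All data is synthetic; the real project attributes are proprietary.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
n_projects = 30

# Project features: e.g., die size, process node, number of IP blocks.
X = rng.random((n_projects, 3)) * [100, 14, 50]
# Target: engineer-months for one skill set (synthetic linear ground truth).
y = 2.0 * X[:, 0] + 5.0 * X[:, 2] + rng.normal(0, 10, n_projects)

# Step 1: "resource utilization templates" via clustering.
templates = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

# Step 2: cross-validated regression, since past projects are few in number.
model = LinearRegression()
scores = cross_val_score(model, X, y, cv=5, scoring="r2")
print(f"template sizes: {np.bincount(templates)}")
print(f"cross-validated R^2: {scores.mean():.2f} +/- {scores.std():.2f}")
```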
Contributors: Bhattacharya, Indrani (Author) / Sen, Arunabha (Thesis advisor) / Kempf, Karl G. (Thesis advisor) / Liu, Huan (Committee member) / Arizona State University (Publisher)
Created: 2013
Description
Contemporary online social platforms present individuals with social signals in the form of news feeds about their peers' activities. On networks such as Facebook and Quora, the network operator decides how that information is shown to an individual. The user, with her own interests and resource constraints, then selectively acts on a subset of the items presented to her. The network operator in turn shows that activity to a selection of peers, creating a behavioral loop. This mechanism of interaction and information flow raises some very interesting questions, such as: can the network operator design social signals to promote a particular activity, like sustainability or public health care awareness, or to promote a specific product? The focus of my thesis is to answer that question. In this thesis, I develop a framework to personalize social signals for users to guide their activities on an online platform. As a result, we gradually nudge the activity distribution on the platform from the initial distribution p to the target distribution q. My work is particularly applicable to guiding collaborations, guiding collective actions, and online advertising. In particular, I first propose a probabilistic model of how users behave and how information flows on the platform. The main part of this thesis then discusses the Influence Individuals through Social Signals (IISS) framework. IISS consists of four main components: (1) Learner: learns users' interests and characteristics from their historical activities using a Bayesian model; (2) Calculator: uses gradient descent to compute the intermediate activity distributions; (3) Selector: selects users who can be influenced to adopt or drop specific activities; (4) Designer: personalizes social signals for each user. I evaluate the performance of the IISS framework by simulation on several network topologies, such as preferential attachment, small world, and random. I show that the framework gradually nudges users' activities toward the target distribution. I use both simulation and mathematical methods to analyze convergence properties, such as how fast and how closely we can approach the target distribution. When the number of activities is 3, I show that for about 45% of target distributions we can achieve a KL divergence as low as 0.05, but for some other distributions the KL divergence can be as large as 0.5.
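The convergence criterion described — KL divergence between the platform's activity distribution and the target — can be illustrated with a minimal sketch. The simple interpolation update below is a stand-in, not the IISS Calculator's actual gradient step, and the distributions are invented.

```python
# Hypothetical sketch: nudging an activity distribution p toward a target q,
# tracking KL(p || q). The interpolation update is a stand-in for IISS.
import numpy as np
from scipy.stats import entropy  # entropy(p, q) returns KL(p || q)

p = np.array([0.70, 0.20, 0.10])  # initial activity distribution (3 activities)
q = np.array([0.30, 0.30, 0.40])  # target distribution

step = 0.2  # fraction of the gap closed per round of social signals
for t in range(10):
    kl = entropy(p, q)
    if kl < 0.05:                  # the thesis's reported achievable threshold
        print(f"round {t}: KL = {kl:.3f} -> converged")
        break
    p = (1 - step) * p + step * q  # nudge toward the target
else:
    print(f"final KL = {entropy(p, q):.3f}")
```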
Contributors: Le, Tien D (Author) / Sundaram, Hari (Thesis advisor) / Davulcu, Hasan (Thesis advisor) / Liu, Huan (Committee member) / Arizona State University (Publisher)
Created: 2014
Description
Uranium (U) contamination has been attracting public concern, and many researchers are investigating principles and applications of U remediation. The overall goal of my research is to understand the versatile roles of sulfate-reducing bacteria (SRB) in uranium bioremediation, including direct involvement (reducing U) and indirect involvement (protecting U against reoxidation). I pursue this goal by studying Desulfovibrio vulgaris, a representative SRB. For the direct involvement, I performed experiments on uranium bioreduction and uraninite (UO2) production in batch tests and in a H2-based membrane biofilm reactor (MBfR) inoculated with D. vulgaris. In summary, D. vulgaris was able to immobilize soluble U(VI) by enzymatically reducing it to insoluble U(IV), and the nanocrystalline UO2 was associated with the biomass. In the MBfR system, although D. vulgaris failed to form a biofilm, other microbial groups capable of U(VI) reduction formed a biofilm, and up to 95% U removal was achieved during long-term operation. For the indirect involvement, I studied the production and characterization of biogenic iron sulfide (FeS) in batch tests. In summary, D. vulgaris produced nanocrystalline FeS, a potential redox buffer to protect UO2 from remobilization by O2. My results demonstrate that a variety of controllable environmental parameters, including pH, free sulfide, and the types of Fe sources and electron donors, significantly determined the characteristics of both biogenic solids, and those characteristics should affect U-sequestering performance by SRB. Overall, my results provide a baseline for developing effective and sustainable approaches to U bioremediation, including the application of the novel MBfR technology to U sequestration from groundwater, the use of biogenic FeS to protect sequestered U against remobilization, and microbe-relevant tools to optimize U sequestration in practice.
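The net chemistry behind the H2-driven bioreduction can be summarized as a simple redox couple; a sketch of the overall reaction, written from the standard half-reactions (an illustration, not a stoichiometry reported in the abstract), is:

```latex
% Overall H2-driven reduction of soluble U(VI) to insoluble uraninite:
%   oxidation:  H_2 \rightarrow 2\,H^+ + 2\,e^-
%   reduction:  UO_2^{2+} + 2\,e^- \rightarrow UO_2(s)
\mathrm{UO_2^{2+}(aq) + H_2(g) \longrightarrow UO_2(s) + 2\,H^+(aq)}
```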
Contributors: Zhou, Chen (Author) / Rittmann, Bruce E. (Thesis advisor) / Krajmalnik-Brown, Rosa (Committee member) / Torres, César I (Committee member) / Arizona State University (Publisher)
Created: 2014
Description
Most data cleaning systems aim to go from a given deterministic dirty database to another deterministic but clean database. Such an enterprise presupposes that it is in fact possible for the cleaning process to uniquely recover the clean version of each dirty tuple. This is not possible in many cases, where the most a cleaning system can do is generate a (hopefully small) set of clean candidates for each dirty tuple. When the cleaning system is required to output a deterministic database, it is forced to pick one clean candidate (say, the "most likely" candidate) per tuple. Such an approach can lead to loss of information. For example, consider a situation where there are three equally likely clean candidates for a dirty tuple: committing to any one of them discards two equally plausible alternatives. An appealing alternative that avoids this information loss is to abandon the requirement that the output database be deterministic. In other words, even though the input (dirty) database is deterministic, I allow the reconstructed database to be probabilistic. Although such an approach does avoid the information loss, it also brings forth several challenges. For example, how many alternatives should be kept per tuple in the reconstructed database? Maintaining too many alternatives increases the size of the reconstructed database, and hence the query processing time. Second, while processing queries on the probabilistic database may well increase recall, how would it affect the precision of query processing? In this thesis, I investigate these questions. My investigation is done in the context of a data cleaning system called BayesWipe, which can produce multiple clean candidates for each dirty tuple, along with the probability that each is the correct cleaned version. I represent these alternatives as tuples in a tuple-disjoint probabilistic database, and use the Mystiq system to process queries on it. This probabilistic reconstruction (called BayesWipe-PDB) is compared to a deterministic reconstruction (called BayesWipe-DET), where the most likely clean candidate for each tuple is chosen and the rest of the alternatives are discarded.
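A minimal sketch of query evaluation over a tuple-disjoint probabilistic database follows: each dirty tuple contributes a set of mutually exclusive alternatives with probabilities, and a selection query returns each answer with the summed probability of the alternatives that produce it. The table and query are invented, not from BayesWipe or Mystiq.

```python
# Hypothetical sketch: a tuple-disjoint probabilistic database, where each
# dirty tuple has mutually exclusive clean alternatives with probabilities.
# A selection query weights answers by total probability. Invented data.

# dirty tuple id -> list of (clean_value, probability); probs per id sum <= 1.
pdb = {
    "t1": [({"city": "Tempe"}, 0.6), ({"city": "Tampa"}, 0.4)],
    "t2": [({"city": "Tempe"}, 0.3), ({"city": "Mesa"}, 0.7)],
}

def select_prob(pdb, predicate):
    """P(tuple satisfies predicate) per dirty tuple, under disjointness."""
    result = {}
    for tid, alternatives in pdb.items():
        p = sum(prob for value, prob in alternatives if predicate(value))
        if p > 0:
            result[tid] = p
    return result

# Query: which tuples have city = 'Tempe', and with what probability?
answers = select_prob(pdb, lambda v: v["city"] == "Tempe")
print(answers)  # {'t1': 0.6, 't2': 0.3}
```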
Contributors: Rihan, Preet Inder Singh (Author) / Kambhampati, Subbarao (Thesis advisor) / Liu, Huan (Committee member) / Davulcu, Hasan (Committee member) / Arizona State University (Publisher)
Created: 2013