This collection includes most of the ASU Theses and Dissertations from 2011 to the present. ASU Theses and Dissertations are available in downloadable PDF format; however, a small percentage of items are under embargo. Information about the dissertations and theses includes degree information, committee members, an abstract, and supporting data or media.

In addition to the electronic theses found in the ASU Digital Repository, ASU Theses and Dissertations can be found in the ASU Library Catalog.

Dissertations and theses granted by Arizona State University are archived and made available through a joint effort of the ASU Graduate College and the ASU Libraries. For more information or questions about this collection, visit the Digital Repository ETD Library Guide or contact the ASU Graduate College at gradformat@asu.edu.

Displaying 1 - 10 of 138

Description
With the advent of social media (Twitter, Facebook, etc.), people are sharing their opinions and sentiments, and asserting their ideologies to others, like never before. Even people who are otherwise socially inactive share their thoughts on current affairs by tweeting and passing news feeds along to their friends and acquaintances. In this thesis study, we chose Twitter as our main data platform to analyze shifts and movements of 27 political organizations in Indonesia. So far, we have collected over 30 million tweets and 150,000 news articles from the RSS feeds of the corresponding organizations for our analysis. For Twitter data extraction, we developed a multi-threaded application that seamlessly extracts, cleans, and stores millions of tweets matching our keywords from the Twitter Streaming API. For keyword extraction, we used topics and perspectives that were extracted using n-gram techniques and later approved by our social scientists. After the data is extracted, we aggregate the tweet contents belonging to every user on a weekly basis. Finally, we applied linear and logistic regression using SLEP, an open-source sparse learning package, to compute a weekly score for each user and map them to one of the 27 organizations on a radical or counter-radical scale. Since we map users to organizations on a weekly basis, we are able to track users' behavior and the important news events that triggered shifts of users between organizations. This thesis study can be further extended to identify topic- and organization-specific influential users, and new users from other social media platforms such as Facebook and YouTube can easily be mapped to the existing organizations on a radical or counter-radical scale.
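As a rough illustration of the weekly scoring step described in this abstract, the sketch below aggregates per-user text and fits a sparse logistic regression to map users to organizations. It uses scikit-learn as a stand-in for the SLEP package named in the thesis, and the documents, labels, and parameters are hypothetical.

```python
# Illustrative sketch only: scikit-learn stands in for the SLEP sparse-learning
# package used in the thesis; documents, labels, and parameters are hypothetical.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Training data: text aggregated per organization, labeled 0..26 in the thesis;
# only two toy organizations are shown here.
train_docs = ["aggregated feed text for organization A ...",
              "aggregated feed text for organization B ..."]
train_labels = [0, 1]

vectorizer = TfidfVectorizer(ngram_range=(1, 3))   # n-gram features
X_train = vectorizer.fit_transform(train_docs)

# L1-penalized (sparse) logistic regression, loosely mirroring a sparse learner
clf = LogisticRegression(penalty="l1", solver="liblinear", C=1.0)
clf.fit(X_train, train_labels)

# weekly_docs: one aggregated string of tweet text per user for a given week
weekly_docs = {"user_a": "tweets this week ...", "user_b": "other tweets ..."}
X_week = vectorizer.transform(weekly_docs.values())

scores = clf.predict_proba(X_week)    # weekly score per user over organizations
assignment = clf.predict(X_week)      # organization each user maps to this week
for user, org in zip(weekly_docs, assignment):
    print(user, "-> organization", org)
```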
Contributors: Poornachandran, Sathishkumar (Author) / Davulcu, Hasan (Thesis advisor) / Sen, Arunabha (Committee member) / Woodward, Mark (Committee member) / Arizona State University (Publisher)
Created: 2013
Description
Many manmade chemicals used in consumer products are ultimately washed down the drain and collected in municipal sewers. Efficient chemical monitoring at wastewater treatment (WWT) plants thus may provide up-to-date information on chemical usage rates for epidemiological assessments. The objective of the present study was to extrapolate this concept, termed 'sewage epidemiology', to include municipal sewage sludge (MSS) in identifying and prioritizing contaminants of emerging concern (CECs). To test this, the following specific aims were defined: i) to screen and identify CECs in nationally representative samples of MSS and to provide nationwide inventories of CECs in U.S. MSS; ii) to investigate the fate and persistence of sludge-borne hydrophobic CECs in MSS-amended soils; and iii) to develop an analytical tool relying on contaminant levels in MSS as an indicator for identifying and prioritizing hydrophobic CECs. Chemicals that are primarily discharged to sewage systems (alkylphenol surfactants) and widespread persistent organohalogen pollutants (perfluorochemicals and brominated flame retardants) were analyzed in nationally representative MSS samples. A meta-analysis showed that CECs contribute about 0.04-0.15% of the total dry mass of MSS, a mass equivalent of 2,700-7,900 metric tonnes of chemicals annually. An analysis of archived mesocosms from a sludge weathering study showed that 64 CECs persisted in MSS/soil mixtures over the course of the experiment, with half-lives ranging between 224 and >990 days; these results suggest an inherent persistence of the CECs that accumulate in MSS. A comparison of the spectrum of chemicals (n=52) analyzed in nationally representative biological specimens from humans and in MSS revealed a 70% overlap. This observed co-occurrence of contaminants in both matrices suggests that MSS may serve as an indicator of ongoing human exposures and body burdens of pollutants. In conclusion, I posit that this novel approach to sewage epidemiology may serve to pre-screen and prioritize the several thousand known or suspected CECs to identify those most prone to pose a risk to human health and the environment.
Contributors: Venkatesan, Arjunkrishna (Author) / Halden, Rolf U. (Thesis advisor) / Westerhoff, Paul (Committee member) / Fox, Peter (Committee member) / Arizona State University (Publisher)
Created: 2013
Description
The consumption of feedstocks from agriculture and forestry by current biofuel production has raised concerns about food security and land availability. In the meantime, intensive human activities have created a large amount of marginal land that requires management. This study investigated the viability of aligning land management with biofuel production on marginal lands. Biofuel crop production on two types of marginal land, namely urban vacant lots and abandoned mine lands (AMLs), was assessed. The investigation of biofuel production on urban marginal land was carried out in Pittsburgh between 2008 and 2011, using the sunflower gardens developed by a Pittsburgh non-profit as an example. Results showed that the crops from urban marginal lands were safe for biofuel use. The crop yield was 20% of that on agricultural land, although low-input agriculture was used in crop cultivation. The energy balance analysis demonstrated that the sunflower gardens could produce a net energy return even at the current low yield. Biofuel production on AMLs was assessed from greenhouse experiments with sunflower, soybean, corn, canola, and camelina. The research successfully created an industrial symbiosis by using bauxite as a soil amendment to enable plant growth on very acidic mine refuse. Phytoremediation and soil amendments were found to effectively reduce contamination in the AML and its runoff. Results from this research support the conclusion that biofuel production on marginal lands could be a unique and feasible option for cultivating biofuel feedstocks.
Contributors: Zhao, Xi (Author) / Landis, Amy (Thesis advisor) / Fox, Peter (Committee member) / Chester, Mikhail (Committee member) / Arizona State University (Publisher)
Created: 2013
Description
The pay-as-you-go economic model of cloud computing increases the visibility, traceability, and verifiability of software costs. Application developers must understand how their software uses resources when running in the cloud in order to stay within budgeted costs and/or produce expected profits. Cloud computing's unique economic model also leads naturally to an earn-as-you-go profit model for many cloud-based applications. These applications can benefit from low-level analyses for cost optimization and verification. Testing cloud applications to ensure they meet monetary cost objectives has not been well explored in the current literature. When considering revenues and costs for cloud applications, the resource economic model can be scaled down to the transaction level in order to associate source code with costs incurred while running in the cloud. Both static and dynamic analysis techniques can be developed and applied to understand how and where cloud applications incur costs. Such analyses can help optimize (i.e., minimize) costs and verify that they stay within expected tolerances. An adaptation of worst-case execution time (WCET) analysis is presented here to statically determine worst-case monetary costs of cloud applications. This analysis is used to produce an algorithm for determining the control-flow paths within an application that can exceed a given cost threshold. The corresponding results are used to identify the path sections that contribute most to the cost excess. A hybrid approach for determining cost excesses is also presented; it consists mostly of dynamic measurements but also incorporates calculations based on the static analysis approach. This approach uses operational profiles to increase the precision and usefulness of the calculations.
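As an illustration of the path-level cost analysis described above, the sketch below enumerates the root-to-exit paths of a small acyclic control-flow graph and reports those whose accumulated monetary cost exceeds a threshold. The graph, per-block dollar costs, and threshold are hypothetical; the code is not the dissertation's algorithm.

```python
# Illustrative sketch: report control-flow paths whose accumulated monetary cost
# exceeds a threshold, assuming an acyclic control-flow graph with per-block
# costs. Block names and dollar figures are hypothetical.
cfg = {                          # adjacency list: basic block -> successors
    "entry": ["read_db", "cache_hit"],
    "read_db": ["render"],
    "cache_hit": ["render"],
    "render": ["exit"],
    "exit": [],
}
cost = {"entry": 0.0, "read_db": 0.004, "cache_hit": 0.0001,
        "render": 0.0005, "exit": 0.0}          # dollars per execution (assumed)

def paths_exceeding(node, threshold, path=(), total=0.0):
    """Yield root-to-exit paths whose total cost exceeds `threshold`."""
    path, total = path + (node,), total + cost[node]
    if not cfg[node]:                           # reached an exit block
        if total > threshold:
            yield path, total
        return
    for succ in cfg[node]:
        yield from paths_exceeding(succ, threshold, path, total)

for p, c in paths_exceeding("entry", threshold=0.001):
    print(" -> ".join(p), f"costs ${c:.4f}")
```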
Contributors: Buell, Kevin, Ph.D (Author) / Collofello, James (Thesis advisor) / Davulcu, Hasan (Committee member) / Lindquist, Timothy (Committee member) / Sen, Arunabha (Committee member) / Arizona State University (Publisher)
Created: 2012
Description
Communication networks, both wired and wireless, are expected to have a certain level of fault-tolerance capability. These networks are also expected to ensure a graceful degradation in performance when some of the network components fail. Traditional studies of fault tolerance in communication networks, for the most part, make no assumptions regarding the location of node/link faults, i.e., the faulty nodes and links may be close to each other or far apart. However, in many real-life scenarios, there exists a strong spatial correlation among the faulty nodes and links. Such failures are often encountered in disaster situations, e.g., natural calamities or enemy attacks. In the presence of such region-based faults, many traditional network analysis and fault-tolerance metrics that are valid under non-spatially-correlated faults are no longer applicable. To this end, the main thrust of this research is the design and analysis of robust networks in the presence of such region-based faults. One important finding of this research is that if some prior knowledge is available on the maximum size of the region that might be affected by a region-based fault, this knowledge can be effectively utilized for resource-efficient network design. It has been shown in this dissertation that, in some scenarios, effective utilization of this knowledge may result in substantial savings in transmission power in wireless networks. In this dissertation, the impact of region-based faults on the connectivity of wireless networks is studied, and a new metric, region-based connectivity, is proposed to measure the fault-tolerance capability of a network. In addition, novel metrics, such as the region-based component decomposition number (RBCDN) and the region-based largest component size (RBLCS), are proposed to capture the network state when a region-based fault disconnects the network. Finally, this dissertation presents efficient resource allocation techniques that ensure tolerance against region-based faults in distributed file storage networks and data center networks.
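In the spirit of the region-based connectivity metric introduced above, the following sketch removes every node that falls inside a disk of a given radius (the assumed maximum fault region) and checks whether the surviving nodes remain connected. The node positions, communication range, fault radius, and the restriction of region centers to node locations are all illustrative assumptions.

```python
# Illustrative sketch: test whether a geometric network stays connected after
# any single circular region fault of radius `fault_radius`. Coordinates and
# parameters are assumptions, not values from the dissertation.
import itertools
import math
import networkx as nx

positions = {0: (0, 0), 1: (1, 0), 2: (2, 0.2), 3: (1, 1.5), 4: (2.5, 1.4)}
comm_range = 1.6        # nodes within this distance share a link (assumed)
fault_radius = 0.7      # maximum radius of a region-based fault (assumed)

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

G = nx.Graph()
G.add_nodes_from(positions)
for u, v in itertools.combinations(positions, 2):
    if dist(positions[u], positions[v]) <= comm_range:
        G.add_edge(u, v)

def survives_region_faults(G, positions, radius):
    """True if the network stays connected after any single disk fault."""
    for center in positions.values():           # candidate fault regions
        dead = {n for n, p in positions.items() if dist(p, center) <= radius}
        H = G.subgraph(set(G) - dead)
        if len(H) > 0 and not nx.is_connected(H):
            return False
    return True

print("tolerates region-based faults:", survives_region_faults(G, positions, fault_radius))
```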
Contributors: Banerjee, Sujogya (Author) / Sen, Arunabha (Thesis advisor) / Xue, Guoliang (Committee member) / Richa, Andrea (Committee member) / Hurlbert, Glenn (Committee member) / Arizona State University (Publisher)
Created: 2013
Description
Data mining is increasing in importance for solving a variety of industry problems. Our initiative involves the estimation of resource requirements by skill set for future projects by mining and analyzing actual resource consumption data from past projects in the semiconductor industry. To achieve this goal, we face difficulties such as relevant consumption information stored in differing formats and insufficient data about project attributes with which to interpret the consumption data. Our first goal is to clean the historical data and organize it into meaningful structures for analysis. Once the preprocessing of the data is complete, data mining techniques such as clustering are applied to find projects that involve resources of similar skill sets and that involve similar complexities and sizes. This results in "resource utilization templates" for groups of related projects from a resource consumption perspective. Then the project characteristics that generate this diversity in headcounts and skill sets are identified. These characteristics are not currently contained in the database and were elicited from the managers of historical projects. This represents an opportunity to improve the usefulness of the data collection system for the future. The ultimate goal is to match product technical features with the resource requirements of past projects as a model to forecast resource requirements by skill set for future projects. The forecasting model is developed using linear regression with cross-validation of the training data, as past project executions are relatively few in number. Acceptable levels of forecast accuracy are achieved relative to human experts' results, and the tool is applied to forecast the resource demand of some future projects.
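In outline, the clustering and cross-validated regression steps described above might resemble the sketch below. The feature columns, the use of k-means, the fold count, and all numbers are illustrative assumptions rather than the thesis's actual data or models.

```python
# Illustrative sketch: cluster past projects by resource-consumption profile,
# then fit a cross-validated linear model mapping project characteristics to
# headcount. Column meanings and numbers are hypothetical.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import KFold, cross_val_score

# rows = past projects; columns = person-weeks consumed per skill set (assumed)
consumption = np.array([
    [40, 10, 5], [38, 12, 4], [90, 30, 20], [95, 28, 22], [10, 2, 1],
])
templates = KMeans(n_clusters=2, n_init=10, random_state=0).fit(consumption)
print("resource utilization template per project:", templates.labels_)

# project characteristics elicited from managers (e.g. feature count, new blocks)
features = np.array([[12, 1], [11, 1], [30, 4], [32, 4], [3, 0]])
headcount = consumption.sum(axis=1)       # total person-weeks per project

model = LinearRegression()
cv = KFold(n_splits=5)                    # leave-one-out here: past projects are few
scores = cross_val_score(model, features, headcount, cv=cv,
                         scoring="neg_mean_absolute_error")
print("cross-validated MAE:", -scores.mean())

model.fit(features, headcount)
print("forecast for a future project:", model.predict([[20, 2]]))
```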
Contributors: Bhattacharya, Indrani (Author) / Sen, Arunabha (Thesis advisor) / Kempf, Karl G. (Thesis advisor) / Liu, Huan (Committee member) / Arizona State University (Publisher)
Created: 2013
Description
Contention-based IEEE 802.11 MAC uses the binary exponential backoff (BEB) algorithm for contention resolution. The protocol suffers from poor performance in heavily loaded networks and MANETs: high collision rates and packet drops, only probabilistic delay guarantees, and unfairness. Many backoff strategies have been proposed to improve the performance of IEEE 802.11, but they all ignore the network topology and demand. Persistence is defined as the fraction of time a node is allowed to transmit; when this allowance takes topology and load into account, it is topology- and load-aware (TLA) persistence. We develop a relation between contention window size and TLA-persistence. We implement a new backoff strategy in which the TLA-persistence is defined as the lexicographic max-min channel allocation. We use a centralized algorithm to calculate each node's TLA-persistence and then convert it into a contention window size. The new backoff strategy is evaluated in simulation and compared with IEEE 802.11 using BEB. Through the simulation study, we show that in most static scenarios, such as the exposed terminal, flow in the middle, star topology, and heavily loaded multi-hop networks, as well as in MANETs, the new backoff strategy achieves higher overall average throughput than IEEE 802.11 using BEB.
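One simple way to picture the conversion from a persistence value to a contention window is the sketch below. The relation CW = 2/p - 1 used here is the textbook approximation for a node drawing its backoff uniformly from a fixed window; it is an assumption for illustration and not necessarily the relation developed in this thesis.

```python
# Illustrative sketch: map a per-node persistence (fraction of slots a node may
# transmit) to an integer contention window. The relation CW = 2/p - 1 is an
# assumed textbook approximation, not necessarily the thesis's derivation.

def persistence_to_cw(p, cw_min=15, cw_max=1023):
    """Map persistence p in (0, 1] to a bounded integer contention window."""
    cw = round(2.0 / p - 1.0)
    return max(cw_min, min(cw_max, cw))

# hypothetical TLA-persistences produced by a lexicographic max-min allocation
persistence = {"node_a": 0.10, "node_b": 0.05, "node_c": 0.02}
for node, p in persistence.items():
    print(node, "persistence", p, "-> contention window", persistence_to_cw(p))
```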
Contributors: Bhyravajosyula, Sai Vishnu Kiran (Author) / Syrotiuk, Violet R. (Thesis advisor) / Sen, Arunabha (Committee member) / Richa, Andrea (Committee member) / Arizona State University (Publisher)
Created: 2013
Description
Contaminants of emerging concern (CECs) present in wastewater effluent can threaten its safe discharge or reuse. Additional barriers of protection can be provided using advanced or natural treatment processes. This dissertation evaluated ozonation and constructed wetlands for removing CECs from wastewater effluent. Organic CECs can be removed by the hydroxyl radicals formed during ozonation; however, estimating the ozone demand of wastewater effluent is complicated by the presence of reduced inorganic species. A method was developed to estimate the ozone consumed only by dissolved organic compounds and to predict trace organic oxidation across multiple wastewater sources. Removal of organic and engineered nanomaterial (ENM) CECs in constructed wetlands was investigated using batch experiments and continuous-flow microcosms containing decaying wetland plants. CEC removal varied depending on the contaminants' physico-chemical properties, the hydraulic residence time (HRT), and the relative quantities of plant materials in the microcosms. At comparable HRTs, ENM removal improved with a higher quantity of plant materials due to enhanced sorption, which was verified in batch-scale studies with plant materials. A fate-predictive model was developed to evaluate the role of design loading rates in organic CEC removal. Areal removal rates increased with hydraulic loading rates (HLRs) and carbon loading rates (CLRs) unless photolysis was the dominant removal mechanism (e.g., atrazine). To optimize CEC removal, wetlands with different CLRs can be used in combination without lowering the net HLR. Organic CEC removal under the denitrifying conditions of constructed wetlands was investigated, and selected CECs (e.g., estradiol) were found to biotransform while denitrification occurred. Although the level of denitrification was affected by HRT, a similar impact on estradiol was not observed because of the dominant effect of plant biomass quantity. Overall, both the modeling and experimental findings suggest considering CLR as a factor equally important as HRT or HLR when designing constructed wetlands for CEC removal. This dissertation provides direction for selecting design parameters for ozonation (ozone dose) and constructed wetlands (design loading rates) to meet organic CEC removal goals. Future research is needed to understand the fate of ENMs during ozonation and to quantify the contributions of the different transformation mechanisms occurring in the wetlands, so that they can be incorporated into a model to evaluate the effect of wetland design.
Contributors: Sharif, Fariya (Author) / Westerhoff, Paul (Thesis advisor) / Halden, Rolf (Committee member) / Fox, Peter (Committee member) / Herckes, Pierre (Committee member) / Arizona State University (Publisher)
Created: 2013
Description
As networks play an increasingly prominent role in different aspects of our lives, there is a growing awareness that improving their performance is of significant importance. In order to enhance network performance, it is essential that scarce networking resources be allocated smartly to match the continuously changing network environment. This dissertation focuses on two different kinds of networks, communication and social, and studies resource allocation problems in these networks. The study of communication networks is further divided by networking technology: wired and wireless, optical and mobile, airborne and terrestrial. Since nodes in an airborne network (AN) are heterogeneous and mobile, the design of a reliable and robust AN is highly complex. The dissertation studies connectivity and fault-tolerance issues in ANs and proposes algorithms to compute the critical transmission range in fault-free, faulty, and delay-tolerant scenarios. Just as in the case of ANs, power optimization and fault tolerance are important issues in wireless sensor networks (WSNs). In a WSN, a tree structure is often used to deliver sensor data to a sink node, and failure of a node may disconnect the tree. The dissertation investigates the problem of enhancing the fault-tolerance capability of data-gathering trees in WSNs. The advent of OFDM technology provides an opportunity for efficient resource utilization in optical networks and also introduces a set of novel problems, such as the routing and spectrum allocation (RSA) problem. This dissertation proves that the RSA problem is NP-complete even when the network topology is a chain, and it proposes approximation algorithms. In the domain of social networks, the focus of this dissertation is the study of influence propagation in the presence of active adversaries. In a social network, multiple vendors may attempt to influence the nodes in a competitive fashion. This dissertation investigates the scenario in which the first vendor has already chosen a set of nodes and the second vendor, with knowledge of the first's choice, attempts to identify a smallest set of nodes so that, after the influence propagation, the second vendor's market share is larger than the first's.
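As a small illustration of the fault-free critical transmission range mentioned above, the sketch below uses the standard observation that the minimum common range keeping a node set connected equals the longest edge of its Euclidean minimum spanning tree. The node positions are hypothetical, and the faulty and delay-tolerant variants studied in the dissertation are not covered.

```python
# Illustrative sketch: in the fault-free case, the critical transmission range
# of a node set equals the longest edge of its Euclidean minimum spanning tree.
# Positions are hypothetical.
import itertools
import math
import networkx as nx

positions = {"a": (0, 0), "b": (3, 1), "c": (5, 4), "d": (1, 5), "e": (6, 0)}

G = nx.Graph()
for u, v in itertools.combinations(positions, 2):
    G.add_edge(u, v, weight=math.dist(positions[u], positions[v]))

mst = nx.minimum_spanning_tree(G, weight="weight")
critical_range = max(d["weight"] for _, _, d in mst.edges(data=True))
print(f"critical transmission range: {critical_range:.2f}")
```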
Contributors: Shirazipourazad, Shahrzad (Author) / Sen, Arunabha (Committee member) / Xue, Guoliang (Committee member) / Richa, Andrea (Committee member) / Saripalli, Srikanth (Committee member) / Arizona State University (Publisher)
Created: 2014
Description
The influence of climate variability and reclaimed wastewater on the water supply necessitates an improved understanding of the treatability of trace and bulk organic matter. Dissolved organic matter (DOM) mobilized during extreme weather events and present in treated wastewater includes natural organic matter (NOM), contaminants of emerging concern (CECs), and microbial extracellular polymeric substances (EPS). The goal of my dissertation was to quantify the impacts of extreme weather events on DOM in surface water and downstream treatment processes, and to improve membrane filtration efficiency and CEC oxidation efficiency during water reclamation with ozone. Surface water quality, air quality, and hydrologic flow rate data were used to quantify changes in DOM and turbidity following dust storms, flooding, or runoff from wildfire burn areas in central Arizona. The subsequent impacts on treatment processes and on public perception of water quality were also discussed. Findings showed a correlation between dust storm events and changes in surface water turbidity (R2=0.6), attenuation of increased DOM through reservoir systems, a 30-40% increase in organic carbon and a 120-600% increase in turbidity following severe flooding, and differing impacts of upland and lowland wildfires. The use of ozone to reduce membrane fouling caused by vesicles (a subcomponent of EPS) and to oxidize CECs through increased hydroxyl radical (HO●) production was investigated. An "ozone dose threshold" was observed above which the addition of hydrogen peroxide increased HO● production, indicating the presence of ambient promoters in wastewater. Ozonation of CECs in secondary effluent over titanium dioxide or activated carbon did not increase radical production. Vesicles fouled ultrafiltration membranes faster (20 times greater flux decline) than polysaccharides, fatty acids, or NOM. Based on the estimated carbon distribution of secondary effluent, vesicles could be responsible for 20-60% of fouling during ultrafiltration and may play a vital role in other environmental processes as well. Ozone reduced vesicle-caused membrane fouling, which, in conjunction with the presence of ambient promoters, helps to explain why low ozone dosages improve membrane flux during full-scale water reclamation.
Contributors: Barry, Michelle (Author) / Barry, Michelle C (Thesis advisor) / Westerhoff, Paul (Committee member) / Fox, Peter (Committee member) / Halden, Rolf (Committee member) / Hristovski, Kiril (Committee member) / Arizona State University (Publisher)
Created: 2014