This collection includes both ASU Theses and Dissertations, submitted by graduate students, and Barrett, The Honors College theses, submitted by undergraduate students.

Description

With the advent of social media (Twitter, Facebook, etc.), people are sharing their opinions and sentiments, and imposing their ideologies on others, like never before. Even people who are otherwise socially inactive like to share their thoughts on current affairs by tweeting and sharing news feeds with their friends and acquaintances. In this thesis study, we chose Twitter as our main data platform to analyze the shifts and movements of 27 political organizations in Indonesia. So far, we have collected over 30 million tweets and 150,000 news articles from the RSS feeds of the corresponding organizations for our analysis. For Twitter data extraction, we developed a multi-threaded application which seamlessly extracts, cleans, and stores millions of tweets matching our keywords from the Twitter Streaming API. For keyword extraction, we used topics and perspectives which were extracted using n-gram techniques and later approved by our social scientists. After the data is extracted, we aggregate each user's tweet content on a weekly basis. Finally, we applied linear and logistic regression using SLEP, an open-source sparse learning package, to compute a weekly score for each user and map them to one of the 27 organizations on a radical or counter-radical scale. Since we map users to organizations weekly, we are able to track users' behavior and the important events that trigger shifts of users between organizations. This thesis study can be further extended to identify topic- and organization-specific influential users, and new users from other social media platforms such as Facebook and YouTube can easily be mapped to existing organizations on a radical or counter-radical scale.
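As a rough illustration of the weekly aggregation and scoring step described above, the sketch below groups tweets by user and ISO week and fits a sparse (L1-regularized) logistic model whose score can be thresholded onto a radical/counter-radical scale. It substitutes scikit-learn for the SLEP package named in the abstract, and the record fields ('user', 'created_at', 'text') are assumptions, not the thesis's actual schema.

```python
# Hedged sketch: weekly per-user aggregation of tweets and an L1-logistic score,
# standing in for the SLEP-based pipeline described in the abstract.
# Library choice (scikit-learn) and all field names are illustrative assumptions.
from collections import defaultdict
from datetime import datetime

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

def week_key(ts: str) -> str:
    """Map an ISO timestamp to an ISO (year, week) bucket, e.g. '2013-W07'."""
    d = datetime.fromisoformat(ts)
    year, week, _ = d.isocalendar()
    return f"{year}-W{week:02d}"

def aggregate_weekly(tweets):
    """tweets: iterable of dicts with 'user', 'created_at', 'text' (assumed schema).
    Returns one concatenated document per (user, week)."""
    docs = defaultdict(list)
    for t in tweets:
        docs[(t["user"], week_key(t["created_at"]))].append(t["text"])
    return {k: " ".join(v) for k, v in docs.items()}

def fit_radical_scorer(labeled_docs, labels):
    """Fit a sparse (L1) logistic model; its decision function gives a weekly score
    per user that can be mapped to a radical or counter-radical side."""
    vec = TfidfVectorizer(ngram_range=(1, 2), min_df=5)
    X = vec.fit_transform(labeled_docs)
    clf = LogisticRegression(penalty="l1", solver="liblinear")
    clf.fit(X, labels)
    return vec, clf
```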
Contributors: Poornachandran, Sathishkumar (Author) / Davulcu, Hasan (Thesis advisor) / Sen, Arunabha (Committee member) / Woodward, Mark (Committee member) / Arizona State University (Publisher)
Created: 2013
Description

The pay-as-you-go economic model of cloud computing increases the visibility, traceability, and verifiability of software costs. Application developers must understand how their software uses resources when running in the cloud in order to stay within budgeted costs and/or produce expected profits. Cloud computing's unique economic model also leads naturally to an earn-as-you-go profit model for many cloud-based applications. These applications can benefit from low-level analyses for cost optimization and verification. Testing cloud applications to ensure they meet monetary cost objectives has not been well explored in the current literature. When considering revenues and costs for cloud applications, the resource economic model can be scaled down to the transaction level in order to associate source code with costs incurred while running in the cloud. Both static and dynamic analysis techniques can be developed and applied to understand how and where cloud applications incur costs. Such analyses can help optimize (i.e., minimize) costs and verify that they stay within expected tolerances. An adaptation of Worst Case Execution Time (WCET) analysis is presented here to statically determine the worst-case monetary costs of cloud applications. This analysis is used to produce an algorithm for determining control flow paths within an application that can exceed a given cost threshold. The corresponding results are used to identify path sections that contribute most to the cost excess. A hybrid approach for determining cost excesses is also presented that consists mostly of dynamic measurements but also incorporates calculations based on the static analysis approach. This approach uses operational profiles to increase the precision and usefulness of the calculations.
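To make the static analysis idea concrete, here is a minimal sketch of one plausible core step: computing the worst-case-cost path through an acyclic control-flow graph whose basic blocks carry known per-execution monetary costs. The graph and cost representation are assumptions for illustration; loop handling and the thesis's threshold-reporting algorithm are not shown. A path whose returned cost exceeds a budget would be flagged as a potential cost excess.

```python
# Hedged sketch of the static analysis idea: treat each basic block as a node in an
# acyclic control-flow graph with a known per-execution monetary cost, and find the
# worst-case-cost entry-to-exit path. Cost model and graph shape are assumptions.
from graphlib import TopologicalSorter

def worst_case_cost(edges, cost, entry, exit_):
    """edges: dict node -> list of successors; cost: dict node -> monetary cost.
    Returns (max cost, path) from entry to exit_ in an acyclic CFG."""
    preds = {n: [] for n in cost}
    for u, succs in edges.items():
        for v in succs:
            preds[v].append(u)
    best = {n: float("-inf") for n in cost}   # best known cost to reach each node
    back = {}                                  # predecessor on the worst-case path
    best[entry] = cost[entry]
    for n in TopologicalSorter(preds).static_order():
        for p in preds[n]:
            cand = best[p] + cost[n]
            if cand > best[n]:
                best[n], back[n] = cand, p
    path, n = [exit_], exit_                   # reconstruct the worst-case path
    while n != entry:
        n = back[n]
        path.append(n)
    return best[exit_], list(reversed(path))
```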
Contributors: Buell, Kevin, Ph.D. (Author) / Collofello, James (Thesis advisor) / Davulcu, Hasan (Committee member) / Lindquist, Timothy (Committee member) / Sen, Arunabha (Committee member) / Arizona State University (Publisher)
Created: 2012
Description

Whole genome sequencing (WGS) and whole exome sequencing (WES) are two comprehensive genomic tests that use next-generation sequencing technology to sequence most of the 3.2 billion base pairs in a human genome (WGS) or many of the estimated 22,000 protein-coding genes in the genome (WES). The promises offered by WGS/WES are: to identify suspected yet unidentified genetic diseases, to characterize the genomic mutations in a tumor in order to identify targeted therapeutic agents, and to predict future diseases with the hope of promoting disease-prevention strategies and/or offering early treatment. Promises notwithstanding, sequencing a human genome presents several interrelated challenges: how to adequately analyze, interpret, store, reanalyze, and apply an unprecedented amount of genomic data (with uncertain clinical utility) to patient care? In addition, genomic data has the potential to become integral to improving the medical care of an individual and their family years after a genome is sequenced. Current informed consent protocols do not adequately address the unique challenges and complexities inherent in the process of WGS/WES. This dissertation constructs a novel informed consent process for individuals considering WGS/WES, capable of fulfilling both the legal and ethical requirements of medical consent while addressing the intricacies of WGS/WES, ultimately resulting in a more effective consenting experience. To better understand the components of an effective consenting experience, the first part of this dissertation traces the historical origin of the informed consent process to identify the motivations, rationales, and institutional commitments that sustain our current consenting protocols for genetic testing. After understanding the underlying commitments that shape our current informed consent protocols, I discuss the effectiveness of the informed consent process from an ethical and legal standpoint. I illustrate how WGS/WES introduces new complexities to the informed consent process and assess whether informed consent protocols proposed for WGS/WES address these complexities. The last section of this dissertation describes a novel informed consent process for WGS/WES, constructed from the original ethical intent of informed consent, analysis of existing informed consent protocols, and my own observations as a genetic counselor of what constitutes an effective consenting experience.
Contributors: Hunt, Katherine (Author) / Hurlbut, J. Benjamin (Thesis advisor) / Robert, Jason S. (Thesis advisor) / Maienschein, Jane (Committee member) / Northfelt, Donald W. (Committee member) / Marchant, Gary (Committee member) / Ellison, Karin (Committee member) / Arizona State University (Publisher)
Created: 2013
Description

Contention-based IEEE 802.11 MAC uses the binary exponential backoff (BEB) algorithm for contention resolution. The protocol suffers from poor performance in heavily loaded networks and MANETs: a high collision rate and packet drops, only probabilistic delay guarantees, and unfairness. Many backoff strategies have been proposed to improve the performance of IEEE 802.11, but all ignore the network topology and demand. Persistence is defined as the fraction of time a node is allowed to transmit; when this allowance takes topology and load into account, it is topology- and load-aware (TLA) persistence. We develop a relation between contention window size and TLA-persistence. We implement a new backoff strategy where the TLA-persistence is defined as the lexicographic max-min channel allocation. We use a centralized algorithm to calculate each node's TLA-persistence and then convert it into a contention window size. The new backoff strategy is evaluated in simulation and compared with IEEE 802.11 using BEB. Through this simulation study, we show that in most static scenarios, such as the exposed terminal, flow in the middle, star topology, and heavily loaded multi-hop networks, as well as in MANETs, the new backoff strategy achieves higher overall average throughput than IEEE 802.11 using BEB.
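The sketch below illustrates the two-step shape of such a strategy: a centralized max-min (water-filling) share of channel time among contending nodes, followed by conversion of each node's persistence into a contention window. The p-to-CW mapping CW ≈ 2/p − 1 is a standard p-persistent CSMA approximation used here only as an assumption; the thesis derives its own relation between contention window size and TLA-persistence, and computes a lexicographic max-min allocation over the full topology rather than this simplified single-neighborhood version.

```python
# Hedged sketch: max-min (water-filling) persistence within one contention
# neighborhood, then a persistence-to-contention-window conversion.
# Both the demand model and the CW = 2/p - 1 mapping are illustrative assumptions.

def max_min_persistence(demands, capacity=1.0):
    """Water-filling max-min share of channel airtime among contending nodes.
    demands: dict node -> desired fraction of airtime (assumed known centrally)."""
    alloc, remaining, active = {}, capacity, dict(demands)
    while active:
        fair = remaining / len(active)
        satisfied = {n: d for n, d in active.items() if d <= fair}
        if not satisfied:                 # nobody fits under the fair share:
            for n in active:              # split the rest equally
                alloc[n] = fair
            return alloc
        for n, d in satisfied.items():    # grant small demands in full
            alloc[n] = d
            remaining -= d
            del active[n]
    return alloc

def persistence_to_cw(p, cw_min=15, cw_max=1023):
    """Convert a persistence value to a contention window size (illustrative)."""
    cw = int(round(2.0 / max(p, 1e-6) - 1))
    return max(cw_min, min(cw, cw_max))
```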
Contributors: Bhyravajosyula, Sai Vishnu Kiran (Author) / Syrotiuk, Violet R. (Thesis advisor) / Sen, Arunabha (Committee member) / Richa, Andrea (Committee member) / Arizona State University (Publisher)
Created: 2013
Description

As networks are playing an increasingly prominent role in different aspects of our lives, there is a growing awareness that improving their performance is of significant importance. In order to enhance the performance of networks, it is essential that scarce networking resources be allocated smartly to match the continuously changing network environment. This dissertation focuses on two different kinds of networks, communication and social, and studies resource allocation problems in these networks. The study on communication networks is further divided by networking technology: wired and wireless, optical and mobile, airborne and terrestrial. Since nodes in an airborne network (AN) are heterogeneous and mobile, the design of a reliable and robust AN is highly complex. The dissertation studies connectivity and fault-tolerance issues in ANs and proposes algorithms to compute the critical transmission range in fault-free, faulty, and delay-tolerant scenarios. Just as in the case of ANs, power optimization and fault tolerance are important issues in wireless sensor networks (WSNs). In a WSN, a tree structure is often used to deliver sensor data to a sink node, and the failure of a node may disconnect the tree. The dissertation investigates the problem of enhancing the fault-tolerance capability of data-gathering trees in WSNs. The advent of OFDM technology provides an opportunity for efficient resource utilization in optical networks, but also introduces a set of novel problems, such as the routing and spectrum allocation (RSA) problem. This dissertation proves that the RSA problem is NP-complete even when the network topology is a chain, and proposes approximation algorithms. In the domain of social networks, the focus of this dissertation is the study of influence propagation in the presence of active adversaries. In a social network, multiple vendors may attempt to influence the nodes in a competitive fashion. This dissertation investigates the scenario where the first vendor has already chosen a set of nodes and the second vendor, with knowledge of the first's choice, attempts to identify a smallest set of nodes so that after the influence propagation, the second vendor's market share is larger than the first's.
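For the fault-free case, one standard way to compute the critical transmission range is as the longest edge of the Euclidean minimum spanning tree over the node positions, since any smaller common range disconnects the induced graph. The sketch below uses that observation with Prim's algorithm; the input format is an assumption, and the dissertation's faulty and delay-tolerant variants are not covered here.

```python
# Hedged sketch of the fault-free critical transmission range: with identical
# ranges, the minimum range keeping the node set connected equals the longest
# edge of the Euclidean MST. Input format is assumed for illustration.
import math

def critical_range(points):
    """points: list of (x, y) positions. Returns the smallest common transmission
    range under which the induced unit-disk graph is connected."""
    n = len(points)
    if n < 2:
        return 0.0
    dist = lambda a, b: math.dist(points[a], points[b])
    in_tree = [False] * n
    best = [math.inf] * n          # cheapest edge connecting each node to the tree
    best[0], longest = 0.0, 0.0
    for _ in range(n):             # Prim's algorithm; track the largest MST edge
        u = min((i for i in range(n) if not in_tree[i]), key=lambda i: best[i])
        in_tree[u] = True
        longest = max(longest, best[u])
        for v in range(n):
            if not in_tree[v]:
                best[v] = min(best[v], dist(u, v))
    return longest
```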
Contributors: Shirazipourazad, Shahrzad (Author) / Sen, Arunabha (Committee member) / Xue, Guoliang (Committee member) / Richa, Andrea (Committee member) / Saripalli, Srikanth (Committee member) / Arizona State University (Publisher)
Created: 2014
Description

Lung Cancer Alliance, a nonprofit organization, released the "No One Deserves to Die" advertising campaign in June 2012. The campaign visuals presented a clean, simple message to the public: the stigma associated with lung cancer drives the marginalization of lung cancer patients. Lung Cancer Alliance (LCA) asserts that negative public attitudes toward lung cancer stem from unacknowledged moral judgments that generate 'stigma.' The campaign materials are meant to expose and challenge the common public category-making processes that occur when subconsciously evaluating lung cancer patients; these processes involve comparison, perception of difference, and exclusion. The campaign implies that society sees the suffering of lung cancer patients as indicative of moral failure and thus not warranting assistance, which leads to marginalization of the diseased. Attributing to society a morally laden view of the disease, the campaign extends this view to its logical end and makes it explicit: lung cancer patients no longer deserve to live because they themselves caused the disease (by smoking). This judgment and the resulting marginalization are, according to LCA, evident in how lung cancer fares relative to other diseases: minimal research funding, high mortality rates, and low awareness. Therefore, society commits an injustice against those with lung cancer. This research analyzes the relationship between disease, identity-making, and responsibilities within society as represented by this stigma framework. LCA asserts that society understands lung cancer in terms of stigma, and advocates that society's understanding of lung cancer be shifted from a stigma framework toward a medical framework. Analysis of the identity-making and responsibility encoded in both frameworks contributes to an evaluation of the significance of reframing this disease. One aim of this thesis is to explore the relationship between these frameworks in medical sociology. The results show a complex interaction which suggests that trading one frame for another will not destigmatize the lung cancer patient. Those interactions cause tangible harms, such as high mortality rates, and there are important implications for other communities that experience a stigmatized disease.
Contributors: Calvelage, Victoria (Author) / Hurlbut, J. Benjamin (Thesis advisor) / Maienschein, Jane (Committee member) / Ellison, Karin (Committee member) / Arizona State University (Publisher)
Created: 2013
Description

Teaching evolution has been shown to be a challenge for faculty in both K-12 and postsecondary education. Many of these challenges stem from perceived conflicts not only between religion and evolution, but also from faculty beliefs about religion, its compatibility with evolutionary theory, and its proper role in classroom curriculum. Studies suggest that if educators engage with students' religious beliefs and identity, this may help students develop positive attitudes towards evolution. The aim of this study was to reveal the attitudes and beliefs professors have about addressing religion and providing religious scientist role models to students when teaching evolution. Fifteen semi-structured interviews of tenured biology professors were conducted at a large Midwestern university regarding their beliefs, experiences, and strategies for teaching evolution and, particularly, their willingness to address religion in a class section on evolution. A qualitative analysis of the transcripts showed that professors did not agree on whether it is their job to help students accept evolution (although the majority said it is not), nor did they agree on a definition of "acceptance of evolution". Professors were willing to engage with students' religious beliefs if this would help their students accept evolution. Finally, professors perceived many challenges to engaging with students' religious beliefs in a science classroom, such as the appropriateness of the material for a science class, large class sizes, and time constraints. Given the results of this study, the author concludes that instructors must come to a consensus about their goals as biology educators, as well as what "acceptance of evolution" means, before they can realistically use engagement with students' religious beliefs and identity as an educational strategy.
Contributors: Barnes, Maryann Elizabeth (Author) / Brownell, Sara E. (Thesis advisor) / Brem, Sarah K. (Thesis advisor) / Lynch, John M. (Committee member) / Ellison, Karin (Committee member) / Arizona State University (Publisher)
Created: 2014
Description

Resource allocation is one of the most challenging issues policy decision makers must address. The objective of this thesis is to explore resource allocation from an economic perspective, i.e., how to purchase resources in order to satisfy customers' requests. In this thesis, we attempt to answer the question: when and how should resources be bought to fulfill customers' demands at minimum cost?

The first topic studied in this thesis is resource allocation in cloud networks. Cloud computing heralded an era where resources (such as computation and storage) can be scaled up and down elastically and on demand. This flexibility is attractive for its cost effectiveness: the cloud resource price depends on the actual utilization over time. This thesis studies two critical problems in cloud networks, focusing on the economic aspects of resource allocation in cloud/virtual networks, and proposes six algorithms to address the resource allocation problems for different discount models. The first problem addresses a scenario where the virtual network provider offers different contracts to the service provider. Four algorithms for resource contract migration are proposed under two pricing models: Pay-as-You-Come and Pay-as-You-Go. The second problem explores a scenario where a cloud provider offers k contracts, each with a duration and a rate, and a customer buys these contracts in order to satisfy its resource demand. This work shows that this problem can be seen as a 2-dimensional generalization of the classic online parking permit problem, and presents a k-competitive online algorithm and an optimal online algorithm.

The second topic studied in this thesis is how resource allocation and purchasing strategies work in our daily life. For example, is it worth buying a yoga pass that costs USD 100 for ten entries, even though it expires at the end of the year? Decisions like these are part of our daily life, yet not much is known today about good online strategies for buying discount vouchers with expiration dates. This work hence introduces the Discount Voucher Purchase Problem (DVPP), which aims to optimize strategies for buying discount vouchers, i.e., coupons, vouchers, and groupons that are valid only during a certain time period. The DVPP comes in three flavors: (1) Once Expire Lose Everything (OELE): vouchers lose their entire value after expiration. (2) Once Expire Lose Discount (OELD): vouchers lose their discount value after expiration. (3) Limited Purchasing Window (LPW): vouchers have the OELE property and can only be bought during a certain time window.

This work explores online algorithms with a provable competitive ratio against a clairvoyant offline algorithm, even in the worst case. In particular, this work makes the following contributions: we present a 4-competitive algorithm for OELE, an 8-competitive algorithm for OELD, and a lower bound for LPW. We also present an optimal offline algorithm for OELE and LPW, and show it is a 2-approximation solution for OELD.
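As a hedged illustration of the kind of online strategy analyzed here (not the thesis's 4-competitive OELE algorithm), the sketch below applies a ski-rental-style break-even rule: keep paying on demand, and buy a voucher only once the pay-as-you-go spending within one validity window would have covered the voucher's price. Prices, the demand model, and the voucher parameters are all assumptions for illustration.

```python
# Hedged illustration of an online voucher-purchase rule for an OELE-type voucher
# (value lost entirely after expiry). This is NOT the thesis's competitive
# algorithm; demand model and parameters are illustrative assumptions.

def online_purchases(demands, unit_price, voucher_price, voucher_units, window):
    """demands: per-period demand in units. Buy a voucher only when recent
    pay-as-you-go spending inside one validity window reaches the voucher price."""
    total_cost, recent = 0.0, []
    voucher_left, voucher_expiry = 0, -1
    for t, d in enumerate(demands):
        # use any still-valid voucher units first
        if t <= voucher_expiry and voucher_left > 0:
            used = min(d, voucher_left)
            voucher_left -= used
            d -= used
        spent = d * unit_price
        total_cost += spent
        recent.append((t, spent))
        recent = [(s, c) for s, c in recent if s > t - window]
        if sum(c for _, c in recent) >= voucher_price:
            total_cost += voucher_price                 # break-even reached: buy
            voucher_left, voucher_expiry = voucher_units, t + window
            recent = []
    return total_cost
```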
Contributors: Hu, Xinhui (Author) / Richa, Andrea (Thesis advisor) / Schmid, Stefan (Committee member) / Sen, Arunabha (Committee member) / Xue, Guoliang (Committee member) / Arizona State University (Publisher)
Created: 2015
Description

Blind and visually impaired individuals have historically demonstrated low participation in the fields of science, technology, engineering, and mathematics (STEM). This low participation is reflected in both their education and career choices. Despite the establishment of the Americans with Disabilities Act (ADA) and the Individuals with Disabilities Education Act (IDEA), blind and visually impaired (BVI) students continue to fall academically below the level of their sighted peers in the areas of science and math. Although this deficit is created by many factors, this study focuses on the lack of adequate accessible image-based materials. Traditional methods for creating accessible image materials for the visually impaired have included detailed verbal descriptions accompanying an image or conversion into a simplified tactile graphic. It is very common that no substitute materials are provided to students in STEM courses because these are image-rich disciplines that often include a large number of images, diagrams, and charts. Additionally, images that are translated into text or simplified into basic line drawings are frequently inadequate because they rely on the interpretations of resource personnel who do not have expertise in STEM. Within this study, a method to create a new type of tactile 3D image was developed using high-density polyethylene (HDPE) and computer numerical control (CNC) milling. These tactile image boards preserve high levels of detail when compared to the original print image. To determine the discernibility and effectiveness of tactile images, these customizable boards were tested in various university classrooms as well as in participation studies that included BVI and sighted students. Results from these studies indicate that tactile images are discernible and were found to improve performance in lab exercises by as much as 60% for those with visual impairment. Incorporating tactile HDPE 3D images into a classroom setting was shown to increase the interest, participation, and performance of BVI students, suggesting that this type of 3D tactile image should be incorporated into STEM classes to increase the participation of these students and improve the level of training they receive in science and math.
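One step implied by the method above is turning a print image into a relief that a CNC toolpath generator could mill into an HDPE board. The sketch below is a speculative illustration of that conversion (grayscale image to a per-pixel milling-depth map) and is not the study's actual fabrication pipeline; the library choice, depth, and resolution are assumptions.

```python
# Hedged sketch: convert a print image into a relief height map that a CNC
# toolpath generator could mill into an HDPE board. Pillow/NumPy, depth, and
# resolution are illustrative assumptions, not the study's pipeline.
import numpy as np
from PIL import Image

def image_to_heightmap(path, max_depth_mm=2.0, board_px=(400, 300), invert=True):
    """Convert an image to a 2D array of milling depths in mm (darker = raised)."""
    img = Image.open(path).convert("L").resize(board_px)
    gray = np.asarray(img, dtype=float) / 255.0      # 0 = black, 1 = white
    relief = 1.0 - gray if invert else gray          # raise dark features
    return relief * max_depth_mm                     # per-pixel depth in mm
```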
Contributors: Gonzales, Ashleigh (Author) / Baluch, Debra P. (Thesis advisor) / Maienschein, Jane (Committee member) / Ellison, Karin (Committee member) / Arizona State University (Publisher)
Created: 2015
Description

The US Senate is the venue of political debate where federal bills are formed and voted on. Senators show their support for or opposition to bills with their votes, and this information makes it possible to extract the polarity of the senators. Similarly, the blogosphere plays an increasingly important role as a forum for public debate, with authors displaying sentiment toward issues, organizations, or people in natural language.

In this research, given a mixed set of senators/blogs debating a set of political issues from opposing camps, I use signed bipartite graphs for modeling debates, and I propose an algorithm for partitioning both the opinion holders (senators or blogs) and the issues (bills or topics) comprising the debate into binary opposing camps. Simultaneously, my algorithm scales the entities on a univariate scale. Using this scale, a researcher can identify moderate and extreme senators/blogs within each camp, and polarizing versus unifying issues. Through performance evaluations I show that my proposed algorithm provides an effective solution to the problem and performs much better than existing baseline algorithms adapted to solve this new problem. In my experiments, I used both real data from the political blogosphere and US Congress records, as well as synthetic data obtained by varying the polarization and degree distribution of the graph's vertices, to show the robustness of my algorithm.
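The modeling setup lends itself to a signed senator-by-bill (or blog-by-topic) matrix with +1 for support, −1 for opposition, and 0 for abstention. As a hedged illustration only (the thesis proposes and evaluates its own partitioning and scaling algorithm), the sketch below co-scales rows and columns with the leading singular vectors of that matrix: the sign of each score suggests the camp, and the magnitude its extremity.

```python
# Hedged sketch of the signed bipartite setup: co-scale opinion holders and issues
# with a leading singular vector pair. This spectral heuristic is for illustration
# only; it is not the thesis's partitioning/scaling algorithm.
import numpy as np

def co_scale(votes):
    """votes: 2D array, rows = senators/blogs, cols = bills/topics, entries in {-1, 0, 1}.
    Returns (row_scores, col_scores): sign suggests the camp, magnitude the extremity."""
    A = np.asarray(votes, dtype=float)
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return U[:, 0] * s[0], Vt[0]

# usage: camps = np.sign(row_scores); moderates have scores near zero
```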

I also applied my algorithm to all terms of the US Senate to date for a longitudinal analysis, and developed a web-based interactive user interface, www.PartisanScale.com, to visualize the results.

US politics is most often polarized with respect to the left/right alignment of entities. However, certain issues do not reflect party-driven polarization but instead show a split correlating with the demographics of the senators, or simply receive consensus. I propose a hierarchical clustering algorithm that identifies groups of bills sharing the same polarization characteristics, and I developed a web-based interactive user interface, www.ControversyAnalysis.com, to visualize the clusters while providing a synopsis through distribution charts, word clouds, and heat maps.
Contributors: Gokalp, Sedat (Author) / Davulcu, Hasan (Thesis advisor) / Sen, Arunabha (Committee member) / Liu, Huan (Committee member) / Woodward, Mark (Committee member) / Arizona State University (Publisher)
Created: 2015