This collection includes most ASU Theses and Dissertations from 2011 to the present. ASU Theses and Dissertations are available in downloadable PDF format; however, a small percentage of items are under embargo. Information about each dissertation or thesis includes degree information, committee members, an abstract, and supporting data or media.

In addition to the electronic theses found in the ASU Digital Repository, ASU Theses and Dissertations can be found in the ASU Library Catalog.

Dissertations and theses granted by Arizona State University are archived and made available through a joint effort of the ASU Graduate College and the ASU Libraries. For more information or questions about this collection, visit the Digital Repository ETD Library Guide or contact the ASU Graduate College at gradformat@asu.edu.

Description
Once perceived as an unimportant occurrence in living organisms, cell degeneration was reconfigured as an important biological phenomenon in development, aging, health, and disease in the twentieth century. This dissertation tells a twentieth-century history of scientific investigations of cell degeneration, including cell death and aging. By describing four central developments in cell degeneration research in the four major chapters, I trace the emergence of the degenerating cell as a scientific object, describe the generation of a variety of concepts, interpretations, and usages associated with cell death and aging, and analyze the transformative influence of rising cell degeneration research. In particular, the four chapters show how changing scientific practices concerning cellular life in embryology, cell culture, aging research, and the molecular biology of Caenorhabditis elegans shaped twentieth-century interpretations of cell degeneration as life-shaping, limit-setting, complex, yet regulated. These events created and consolidated important concepts in the life sciences such as programmed cell death, the Hayflick limit, apoptosis, and death genes. These cases also subsequently transformed the material and epistemic practices surrounding the end of cellular life and led to the formation of new research communities. Together, the four cases show how cell degeneration became a subject shared between molecular cell biology, developmental biology, gerontology, oncology, and the pathology of degenerative diseases. These practices and perspectives created a special kind of interconnectivity between different fields and led to a level of interdisciplinarity within cell degeneration research by the early 1990s.
ContributorsJiang, Lijing (Author) / Maienschein, Jane (Thesis advisor) / Laubichler, Manfred (Thesis advisor) / Hurlbut, James (Committee member) / Creath, Richard (Committee member) / White, Michael (Committee member) / Arizona State University (Publisher)
Created2013
Description
Answer Set Programming (ASP) is one of the most prominent and successful knowledge representation paradigms. The success of ASP is due to its expressive non-monotonic modeling language and its efficient computational methods, which originate from building propositional satisfiability solvers. The wide adoption of ASP has motivated several extensions to its modeling language in order to enhance expressivity, such as incorporating aggregates and interfaces with ontologies. Also, in order to overcome the grounding bottleneck of computation in ASP, there is increasing interest in integrating ASP with other computing paradigms, such as Constraint Programming (CP) and Satisfiability Modulo Theories (SMT). Due to the non-monotonic nature of the ASP semantics, such enhancements turned out to be non-trivial, and the existing extensions are not fully satisfactory. We observe that one main reason for the difficulties is rooted in the propositional semantics of ASP, which is limited in handling first-order constructs (such as aggregates and ontologies) and functions (such as constraint variables in CP and SMT) in natural ways. This dissertation presents a unifying view of these extensions by treating them as instances of formulas with generalized quantifiers and intensional functions. We extend the first-order stable model semantics by Ferraris, Lee, and Lifschitz to allow generalized quantifiers, which cover aggregates, DL-atoms, constraints, and SMT theory atoms as special cases. Using this unifying framework, we study and relate different extensions of ASP. We also present a tight integration of ASP with SMT, based on which we enhance the action language C+ to handle reasoning about continuous changes. Our framework yields a systematic approach to studying and extending non-monotonic languages.
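The stable model semantics that ASP solvers compute efficiently can be illustrated in miniature. The following sketch (not from the dissertation; the example program is invented) brute-forces the stable models of a small propositional normal program via the Gelfond-Lifschitz reduct:

```python
# Hypothetical sketch: brute-force stable model computation for a
# propositional normal logic program, to illustrate the semantics
# that real ASP solvers implement far more efficiently.
from itertools import chain, combinations

# A rule is (head, positive_body, negative_body).
program = [
    ("p", {"q"}, {"r"}),   # p :- q, not r.
    ("q", set(), set()),   # q.  (a fact)
    ("r", set(), {"p"}),   # r :- not p.
]

def least_model(positive_rules):
    """Least fixpoint of a negation-free program."""
    model, changed = set(), True
    while changed:
        changed = False
        for head, pos, _ in positive_rules:
            if pos <= model and head not in model:
                model.add(head)
                changed = True
    return model

def stable_models(program):
    atoms = {h for h, _, _ in program} | set(
        chain.from_iterable(p | n for _, p, n in program))
    for k in range(len(atoms) + 1):
        for cand in map(set, combinations(sorted(atoms), k)):
            # Gelfond-Lifschitz reduct: drop rules whose negative body
            # intersects the candidate, then erase the "not" atoms.
            reduct = [(h, p, set()) for h, p, n in program if not (n & cand)]
            if least_model(reduct) == cand:
                yield cand

print([sorted(m) for m in stable_models(program)])  # [['p', 'q'], ['q', 'r']]
```

The two answer sets arise from the mutual "not" cycle between `p` and `r`, the kind of non-monotonic behavior that makes extensions such as aggregates and theory atoms non-trivial to add.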
ContributorsMeng, Yunsong (Author) / Lee, Joohyung (Thesis advisor) / Ahn, Gail-Joon (Committee member) / Baral, Chitta (Committee member) / Fainekos, Georgios (Committee member) / Lifschitz, Vladimir (Committee member) / Arizona State University (Publisher)
Created2013
Description
With the growth of IT products and sophisticated software in various operating systems, security risks in systems are constantly increasing. Consequently, security assessment is now considered one of the primary security mechanisms for measuring the assurance of systems, since systems that are not compliant with security requirements may allow adversaries to access critical information by circumventing security practices. In order to ensure security, considerable effort has been spent developing security regulations that codify security best practices. Applying shared security standards to a system is critical to understanding vulnerabilities and preventing well-known threats from exploiting them. However, many end users tend to change the configurations of their systems without paying attention to security. Hence, it is not straightforward to protect systems from being changed by unaware users in a timely manner. Detecting the installation of harmful applications is not sufficient, since attackers may exploit risky software as well as commonly used software. In addition, checking the assurance of security configurations only periodically is disadvantageous in terms of time and cost, due to zero-day attacks and timing attacks that can leverage the window between security checks. Therefore, an event-driven monitoring approach is critical to continuously assess the security of a target system without ignoring the window between security checks, and to lessen the burden of the exhaustive task of inspecting the entire configuration of the system. Furthermore, the system should be able to generate a vulnerability report for any change initiated by a user if the change relates to requirements in the standards and turns out to be vulnerable. Assessing various systems in distributed environments also requires consistently applying standards to each environment, because the assessment approach for detecting security vulnerabilities may vary across applications and operating systems. In this thesis, I introduce an automated event-driven security assessment framework to address the aforementioned issues. I also discuss the implementation details, which are based on commercial-off-the-shelf technologies, and the testbed established to evaluate the approach. In addition, I describe evaluation results that demonstrate the effectiveness and practicality of the approach.
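The core of the event-driven idea can be sketched as follows. This is an illustrative toy, not the framework's actual API; the requirement names and handler signature are invented. Each configuration-change event is checked against the standard immediately, rather than waiting for a periodic scan:

```python
# Toy event-driven assessment: on each configuration change, check only
# the touched item against the security requirements and report at once.
# The requirement keys and predicates below are illustrative assumptions.
REQUIREMENTS = {
    # config key -> predicate the security standard requires to hold
    "password_min_length": lambda v: v >= 12,
    "firewall_enabled": lambda v: v is True,
}

def on_config_change(key, new_value):
    """Return a finding immediately if the change violates a requirement."""
    check = REQUIREMENTS.get(key)
    if check is None:
        return None  # change is not security-relevant; no report
    if check(new_value):
        return {"key": key, "status": "compliant"}
    return {"key": key, "status": "vulnerable", "value": new_value}

print(on_config_change("password_min_length", 6))
# {'key': 'password_min_length', 'status': 'vulnerable', 'value': 6}
```

Because only the changed item is re-assessed, there is no unmonitored window between checks and no exhaustive sweep of the whole configuration on every event.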
ContributorsSeo, Jeong-Jin (Author) / Ahn, Gail-Joon (Thesis advisor) / Yau, Stephen S. (Committee member) / Lee, Joohyung (Committee member) / Arizona State University (Publisher)
Created2014
Description
Whole genome sequencing (WGS) and whole exome sequencing (WES) are two comprehensive genomic tests which use next-generation sequencing technology to sequence most of the 3.2 billion base pairs in a human genome (WGS) or many of the estimated 22,000 protein-coding genes in the genome (WES). The promises offered by WGS/WES are: to identify suspected yet unidentified genetic diseases, to characterize the genomic mutations in a tumor in order to identify targeted therapeutic agents, and to predict future diseases in the hope of promoting disease prevention strategies and/or offering early treatment. Promises notwithstanding, sequencing a human genome presents several interrelated challenges: how can an unprecedented amount of genomic data (with uncertain clinical utility) be adequately analyzed, interpreted, stored, reanalyzed, and applied to patient care? In addition, genomic data have the potential to become integral to improving the medical care of an individual and their family years after a genome is sequenced. Current informed consent protocols do not adequately address the unique challenges and complexities inherent to the process of WGS/WES. This dissertation constructs a novel informed consent process for individuals considering WGS/WES, capable of fulfilling both the legal and ethical requirements of medical consent while addressing the intricacies of WGS/WES, ultimately resulting in a more effective consenting experience. To better understand the components of an effective consenting experience, the first part of this dissertation traces the historical origin of the informed consent process to identify the motivations, rationales, and institutional commitments that sustain our current consenting protocols for genetic testing. After examining the underlying commitments that shape our current informed consent protocols, I discuss the effectiveness of the informed consent process from an ethical and legal standpoint.
I illustrate how WGS/WES introduces new complexities to the informed consent process and assess whether informed consent protocols proposed for WGS/WES address these complexities. The last section of this dissertation describes a novel informed consent process for WGS/WES, constructed from the original ethical intent of informed consent, analysis of existing informed consent protocols, and my own observations as a genetic counselor for what constitutes an effective consenting experience.
ContributorsHunt, Katherine (Author) / Hurlbut, J. Benjamin (Thesis advisor) / Robert, Jason S. (Thesis advisor) / Maienschein, Jane (Committee member) / Northfelt, Donald W. (Committee member) / Marchant, Gary (Committee member) / Ellison, Karin (Committee member) / Arizona State University (Publisher)
Created2013
Description
Access control is necessary for information assurance in many of today's applications, such as banking and electronic health records. Access control breaches are critical security problems that can result from unintended and improper implementation of security policies. Security testing can help security architects and security engineers identify security vulnerabilities early and avoid the unexpectedly expensive cost of handling breaches. The process of security testing, which involves creating tests that effectively examine vulnerabilities, is a challenging task. Role-Based Access Control (RBAC) has been widely adopted to support fine-grained access control. However, in practice, due to the complexity of role management, role hierarchies with hundreds of roles, and their associated privileges and users, systematically testing RBAC systems is crucial to ensure security in various domains ranging from cyber-infrastructure to mission-critical applications. In this thesis, we introduce i) a security testing technique for RBAC systems considering the principle of maximum privileges, the structure of the role hierarchy, and a new security test coverage criterion; ii) an MTBDD (Multi-Terminal Binary Decision Diagram) based representation of RBAC security policy, including an RHMTBDD (Role Hierarchy MTBDD), to efficiently generate effective positive and negative security test cases; and iii) a security testing framework which takes an XACML-based RBAC security policy as input, parses it into an RHMTBDD representation, and then generates positive and negative test cases. We also demonstrate the efficacy of our approach through case studies.
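The positive/negative test-case split can be illustrated without the MTBDD machinery. The toy below (roles and permissions are invented, and this is not the thesis's RHMTBDD representation) derives both kinds of cases from a small RBAC policy with a role hierarchy, where senior roles inherit junior roles' permissions:

```python
# Toy RBAC policy: senior roles inherit all permissions of their juniors.
hierarchy = {"admin": {"manager"}, "manager": {"employee"}, "employee": set()}
direct_perms = {
    "admin": {"delete_record"},
    "manager": {"approve_record"},
    "employee": {"read_record"},
}

def effective_perms(role):
    """Permissions of a role, including everything inherited transitively."""
    perms = set(direct_perms[role])
    for junior in hierarchy[role]:
        perms |= effective_perms(junior)
    return perms

all_perms = set().union(*direct_perms.values())
# Positive tests: (role, permission) pairs the policy must allow.
positive = {(r, p) for r in hierarchy for p in effective_perms(r)}
# Negative tests: every other pair, which the system must deny.
negative = {(r, p) for r in hierarchy for p in all_perms} - positive

assert ("admin", "read_record") in positive       # inherited down the hierarchy
assert ("employee", "delete_record") in negative  # must be denied
```

Exercising the negative cases is what catches over-privilege bugs, which is why the role hierarchy structure matters when generating tests.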
ContributorsGupta, Poonam (Author) / Ahn, Gail-Joon (Thesis advisor) / Collofello, James (Committee member) / Huang, Dijiang (Committee member) / Arizona State University (Publisher)
Created2014
Description
In 1997, developmental biologist Michael Richardson compared his research team's embryo photographs to Ernst Haeckel's 1874 embryo drawings and called Haeckel's work noncredible. Science soon published “Haeckel's Embryos: Fraud Rediscovered,” and Richardson's comments further reinvigorated criticism of Haeckel by others, with articles in The American Biology Teacher, “Haeckel's Embryos and Evolution: Setting the Record Straight,” and the New York Times, “Biology Text Illustrations More Fiction than Fact.” Meanwhile, others emphatically stated that the goal of comparative embryology was not to resurrect Haeckel's work. At the center of the controversy was Haeckel's no-longer-accepted idea of recapitulation. Haeckel believed that the development of an embryo revealed the adult stages of the organism's ancestors. Haeckel represented this idea with drawings of vertebrate embryos at similar developmental stages. This is Haeckel's embryo grid, the most common of all illustrations in biology textbooks. Yet Haeckel's embryo grids are much more complex than any textbook explanation. I examined 240 high school biology textbooks, from 1907 to 2010, for embryo grids. I coded and categorized the grids according to accompanying discussion of (a) embryonic similarities, (b) recapitulation, (c) common ancestors, and (d) evolution. The textbooks show changing narratives. Embryo grids gained prominence in the 1940s, and the trend continued until criticisms of Haeckel reemerged in the late 1990s, resulting in (a) grids with fewer organisms and developmental stages or (b) no grid at all. Discussion about embryos and evolution dropped significantly.
ContributorsWellner, Karen L (Author) / Maienschein, Jane (Thesis advisor) / Ellison, Karin D. (Committee member) / Creath, Richard (Committee member) / Robert, Jason S. (Committee member) / Laubichler, Manfred D. (Committee member) / Arizona State University (Publisher)
Created2014
Description
The digital forensics community has neglected email forensics as a process, despite the fact that email remains an important tool in the commission of crime. Current forensic practices focus mostly on disk forensics, while email forensics is left as an analysis task stemming from that practice. As there is no well-defined process for email forensics, the comprehensiveness, extensibility of tools, uniformity of evidence, usefulness in collaborative/distributed environments, and consistency of investigations are hindered. At present, there exists little support for discovering, acquiring, and representing web-based email, despite its widespread use. To remedy this, a systematic process for discovering, acquiring, and representing web-based email is presented, one which is integrated into the normal forensic analysis workflow and which accommodates the distinct characteristics of email evidence. This process focuses on detecting the presence of non-obvious artifacts related to email accounts, retrieving the data from the service provider, and representing email in a well-structured format based on existing standards. As a result, developers and organizations can collaboratively create and use analysis tools that analyze email evidence from any source in the same fashion, and examiners can access additional data relevant to their forensic cases. Finally, an extensible framework implementing this novel process-driven approach has been built to address the problems of comprehensiveness, extensibility, uniformity, collaboration/distribution, and consistency within forensic investigations involving email evidence.
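The "well-structured representation based on existing standards" idea can be sketched with Python's standard-library RFC 5322 parser. The field names and the sample message below are illustrative assumptions, not the schema defined in the thesis:

```python
# Sketch: normalize a raw RFC 5322 message into a structured record,
# with a hash of the raw bytes to preserve evidence integrity.
import hashlib
import json
from email import policy
from email.parser import BytesParser

raw = (b"From: alice@example.com\r\n"
       b"To: bob@example.com\r\n"
       b"Subject: invoice\r\n"
       b"Message-ID: <123@example.com>\r\n\r\n"
       b"Please see attached.\r\n")

msg = BytesParser(policy=policy.default).parsebytes(raw)
record = {
    "message_id": str(msg["Message-ID"]),
    "from": str(msg["From"]),
    "to": str(msg["To"]),
    "subject": str(msg["Subject"]),
    "body": msg.get_content(),
    # Hashing the raw bytes lets any examiner verify the evidence later.
    "sha256": hashlib.sha256(raw).hexdigest(),
}
print(json.dumps(record, indent=2))
```

Because every source (disk image, provider export, web-based mailbox) can be normalized into the same record shape, tools built against that shape work on evidence from any origin.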
ContributorsPaglierani, Justin W (Author) / Ahn, Gail-Joon (Thesis advisor) / Yau, Stephen S. (Committee member) / Santanam, Raghu T (Committee member) / Arizona State University (Publisher)
Created2013
Description
Attribute-Based Access Control (ABAC) mechanisms have been attracting a lot of interest from the research community in recent times, especially because of the flexibility and extensibility they provide by using attributes assigned to subjects as the basis for access control. ABAC enables an administrator of a server to enforce access policies on data, services, and other such resources fairly easily. It also accommodates new policies and changes to existing policies gracefully, making it a potentially good mechanism for implementing access control in large systems, particularly in today's age of cloud computing. However, management of attributes in an ABAC environment is an area that has received little attention. A mechanism allowing multiple ABAC-based systems to share data and resources can go a long way toward making ABAC scalable, while each system should still be able to specify its own attribute sets independently. In the research presented in this document, a new mechanism is proposed that enables users to share resources and data in a cloud environment using ABAC techniques in a distributed manner. The focus is mainly on decentralizing the access policy specifications for the shared data so that each data owner can specify the access policy independently of others. The concepts of ontologies and the semantic web are introduced into the ABAC paradigm to help give a scalable structure to the attributes and to allow systems having different sets of attributes to communicate and share resources.
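The basic ABAC decision model underlying this work can be shown in a few lines. This is only an illustrative sketch (the attributes and rules are invented, and the thesis layers ontologies and semantic-web structure on top of this idea): access is decided by predicates over subject, resource, and environment attributes rather than by identities or roles.

```python
# Minimal ABAC evaluation: permit iff every attribute condition holds.
def abac_decide(policy, subject, resource, environment):
    """Each rule is a predicate over (subject, resource, environment)."""
    return all(rule(subject, resource, environment) for rule in policy)

policy = [
    lambda s, r, e: s["department"] == r["owner_department"],
    lambda s, r, e: s["clearance"] >= r["sensitivity"],
    lambda s, r, e: e["hour"] in range(8, 18),  # business hours only
]

subject = {"department": "radiology", "clearance": 3}
resource = {"owner_department": "radiology", "sensitivity": 2}

print(abac_decide(policy, subject, resource, {"hour": 10}))  # True
print(abac_decide(policy, subject, resource, {"hour": 22}))  # False
```

The scalability question the thesis addresses starts exactly here: two systems can only evaluate each other's policies if they agree on what attributes like `clearance` mean, which is where a shared ontology comes in.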
ContributorsPrabhu Verleker, Ashwin Narayan (Author) / Huang, Dijiang (Thesis advisor) / Ahn, Gail-Joon (Committee member) / Dasgupta, Partha (Committee member) / Arizona State University (Publisher)
Created2014
Description
This thesis addresses the ever-increasing threat of botnets in the smartphone domain, focusing on the Android platform and botnets that use Online Social Networks (OSNs) as the Command and Control (C&C) medium. With any botnet, C&C is one of the components on which the survival of the botnet depends: individual bots use the C&C channel to receive commands and send data. This thesis develops an active host-based approach for identifying the presence of a bot, based on anomalies in the user's usage patterns before and after the bot is installed on the smartphone, and for alerting the user to the bot's presence. A profile is constructed for each user from regular web usage patterns (obtained by intercepting http(s) traffic), and machine learning techniques continuously learn the user's behavior and changes in that behavior, all the while watching for any anomaly above a threshold, which causes the user to be notified of the anomalous traffic. A prototype bot which uses OSNs as its C&C channel is constructed and used for testing. Users are given smartphones (Nexus 4 and Galaxy Nexus) running an application proxy which intercepts http(s) traffic and relays it to a server; the server uses the traffic to construct the model for a particular user and looks for any signs of anomalies. This approach lays the groundwork for future host-based countermeasures for smartphone botnets using OSNs as a C&C channel.
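The threshold-on-deviation idea can be sketched in its simplest form. This toy (the baseline numbers are invented, and the thesis learns a richer model from intercepted http(s) traffic than this z-score) flags an observation as anomalous when it strays too far from the user's learned baseline:

```python
# Toy host-based anomaly check: alert when a traffic feature deviates
# from the user's baseline by more than a fixed number of std devs.
import statistics

# Hypothetical per-user baseline: requests per hour during normal use.
baseline_requests_per_hour = [12, 15, 9, 14, 11, 13, 10, 12]

mean = statistics.mean(baseline_requests_per_hour)
stdev = statistics.stdev(baseline_requests_per_hour)

def is_anomalous(observed, threshold=3.0):
    """True when the observation is > threshold std devs from baseline."""
    return abs(observed - mean) / stdev > threshold

print(is_anomalous(13))   # normal usage
print(is_anomalous(90))   # bot-like burst of C&C traffic
```

A real deployment would track many features (destinations, timing, payload sizes) and update the baseline continuously, but the alerting decision reduces to this kind of deviation test.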
ContributorsKilari, Vishnu Teja (Author) / Xue, Guoliang (Thesis advisor) / Ahn, Gail-Joon (Committee member) / Dasgupta, Partha (Committee member) / Arizona State University (Publisher)
Created2013
Description
Lung Cancer Alliance, a nonprofit organization, released the "No One Deserves to Die" advertising campaign in June 2012. The campaign visuals presented a clean, simple message to the public: the stigma associated with lung cancer drives the marginalization of lung cancer patients. Lung Cancer Alliance (LCA) asserts that negative public attitudes toward lung cancer stem from unacknowledged moral judgments that generate 'stigma.' The campaign materials are meant to expose and challenge these common public category-making processes that occur when subconsciously evaluating lung cancer patients. These processes involve comparison, perception of difference, and exclusion. The campaign implies that society sees the suffering of lung cancer patients as indicative of moral failure and thus not warranting assistance from society, which leads to the marginalization of the diseased. Attributing to society a morally laden view of the disease, the campaign extends this view to its logical end and makes it explicit: lung cancer patients no longer deserve to live because they themselves caused the disease (by smoking). This judgment and the resulting marginalization are, according to LCA, evident in the ways lung cancer patients are marginalized relative to those with other diseases via minimal research funding, high mortality rates, and low awareness of the disease. Therefore, society commits an injustice against those with lung cancer. This research analyzes the relationship between disease, identity-making, and responsibilities within society as represented by this stigma framework. LCA asserts that society understands lung cancer in terms of stigma, and advocates that society's understanding of lung cancer should be shifted from a stigma framework toward a medical framework. Analysis of the identity-making and responsibility encoded in both frameworks contributes to an evaluation of the significance of reframing this disease.
One aim of this thesis is to explore the relationship between these frameworks in medical sociology. The results show a complex interaction suggesting that trading one frame for another will not destigmatize the lung cancer patient. Those interactions cause tangible harms, such as high mortality rates, and there are important implications for other communities that experience a stigmatized disease.
ContributorsCalvelage, Victoria (Author) / Hurlbut, J. Benjamin (Thesis advisor) / Maienschein, Jane (Committee member) / Ellison, Karin (Committee member) / Arizona State University (Publisher)
Created2013