Matching Items (40)
Description
Gender and sex are often conflated. Our laws, policies, and even science establish sex and gender as intrinsically linked and dimorphic in nature. This dissertation examines the relationship between sex and gender and the repercussions of this linked dimorphism in the realms of law, politics, and science. Chapter One examines the legal climate for changing one's legal sexual identity following surgical reassignment, paying particular attention to the ability of post-surgical transsexuals to marry in their acquired sex. Chapter Two considers the process for identifying the sex of athletes for participation in sex-segregated athletic events, specifically the role of testing and of standards for categorization. Chapter Three explores the process of identifying and assigning the sex of intersex children. Chapter Four examines prenatal sex selection and its ethical implications, and offers an anticipatory governance framework to address them.
Contributors: Parsi, John (Author) / Crittenden, Jack (Thesis advisor) / Guston, David H. (Committee member) / Marchant, Gary (Committee member) / Arizona State University (Publisher)
Created: 2013
Description
With the growth of IT products and sophisticated software across operating systems, security risks in systems are rising constantly. Consequently, security assessment is now considered one of the primary mechanisms for measuring the assurance of systems, since systems that do not comply with security requirements may allow adversaries to access critical information by circumventing security practices. To ensure security, considerable effort has been spent developing security regulations that codify security best practices. Applying shared security standards to a system is critical to understanding its vulnerabilities and preventing well-known threats from exploiting them. However, many end users change the configurations of their systems without paying attention to security, so it is not straightforward to protect systems in a timely manner from changes made by unaware users. Detecting the installation of harmful applications is not sufficient, since attackers may exploit commonly used software as well as overtly risky software. In addition, checking the assurance of security configurations only periodically is disadvantageous in terms of time and cost, because zero-day attacks and timing attacks can leverage the window between security checks. An event-driven monitoring approach is therefore critical: it continuously assesses the security of a target system without leaving a window between checks and lessens the burden of exhaustively inspecting every configuration in the system. Furthermore, the system should be able to generate a vulnerability report for any change initiated by a user whenever that change relates to requirements in the standards and turns out to be vulnerable. Assessing systems in distributed environments also requires applying standards to each environment consistently. Such a uniform, consistent assessment is important because the approach for detecting security vulnerabilities may vary across applications and operating systems. In this thesis, I introduce an automated event-driven security assessment framework that overcomes and accommodates the aforementioned issues. I also discuss the implementation details, which are based on commercial-off-the-shelf technologies, and the testbed established to evaluate the approach. Finally, I describe evaluation results that demonstrate the effectiveness and practicality of the approach.
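To make the event-driven idea concrete, the following minimal Python sketch shows one way such a framework might react to a single configuration-change event. The baseline rules, setting names, and helper names (BASELINE_RULES, assess_change) are illustrative assumptions for this sketch, not code or requirements from the thesis.

```python
# Illustrative sketch of event-driven configuration assessment (not the thesis implementation).
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class ConfigChangeEvent:
    """A configuration change reported by the monitored system."""
    setting: str    # e.g. "ssh.permit_root_login"
    new_value: str  # value after the user's change

# Hypothetical baseline derived from a shared security standard:
# each rule maps a setting to a predicate that returns True when compliant.
BASELINE_RULES: dict[str, Callable[[str], bool]] = {
    "ssh.permit_root_login": lambda v: v.lower() == "no",
    "password.min_length":   lambda v: v.isdigit() and int(v) >= 12,
}

def assess_change(event: ConfigChangeEvent) -> Optional[dict]:
    """Assess a change as it happens; return a vulnerability report if non-compliant."""
    rule = BASELINE_RULES.get(event.setting)
    if rule is None or rule(event.new_value):
        return None  # not covered by the standard, or still compliant
    return {
        "setting": event.setting,
        "observed": event.new_value,
        "finding": "violates baseline requirement",
    }

# Example: a user re-enables root SSH login; the event triggers an immediate report.
report = assess_change(ConfigChangeEvent("ssh.permit_root_login", "yes"))
print(report)
```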
Contributors: Seo, Jeong-Jin (Author) / Ahn, Gail-Joon (Thesis advisor) / Yau, Stephen S. (Committee member) / Lee, Joohyung (Committee member) / Arizona State University (Publisher)
Created: 2014
Description
Whole genome sequencing (WGS) and whole exome sequencing (WES) are two comprehensive genomic tests that use next-generation sequencing technology to sequence most of the 3.2 billion base pairs in a human genome (WGS) or many of the estimated 22,000 protein-coding genes in the genome (WES). The promises offered by WGS/WES are: to identify suspected yet unidentified genetic diseases; to characterize the genomic mutations in a tumor in order to identify targeted therapeutic agents; and to predict future diseases, with the hope of promoting disease prevention strategies and/or offering early treatment. Promises notwithstanding, sequencing a human genome presents several interrelated challenges, chief among them how to adequately analyze, interpret, store, reanalyze, and apply an unprecedented amount of genomic data (with uncertain clinical utility) to patient care. In addition, genomic data has the potential to become integral to improving the medical care of an individual and their family years after a genome is sequenced. Current informed consent protocols do not adequately address the unique challenges and complexities inherent to the process of WGS/WES. This dissertation constructs a novel informed consent process for individuals considering WGS/WES, capable of fulfilling both legal and ethical requirements of medical consent while addressing the intricacies of WGS/WES, ultimately resulting in a more effective consenting experience. To better understand the components of an effective consenting experience, the first part of this dissertation traces the historical origin of the informed consent process to identify the motivations, rationales, and institutional commitments that sustain our current consenting protocols for genetic testing. After examining the underlying commitments that shape our current informed consent protocols, I discuss the effectiveness of the informed consent process from an ethical and legal standpoint. I illustrate how WGS/WES introduces new complexities to the informed consent process and assess whether informed consent protocols proposed for WGS/WES address these complexities. The last section of this dissertation describes a novel informed consent process for WGS/WES, constructed from the original ethical intent of informed consent, analysis of existing informed consent protocols, and my own observations as a genetic counselor of what constitutes an effective consenting experience.
Contributors: Hunt, Katherine (Author) / Hurlbut, J. Benjamin (Thesis advisor) / Robert, Jason S. (Thesis advisor) / Maienschein, Jane (Committee member) / Northfelt, Donald W. (Committee member) / Marchant, Gary (Committee member) / Ellison, Karin (Committee member) / Arizona State University (Publisher)
Created: 2013
Description
The digital forensics community has neglected email forensics as a process, despite the fact that email remains an important tool in the commission of crime. Current forensic practice focuses mostly on disk forensics, with email forensics left as an analysis task stemming from that practice. Because there is no well-defined process for email forensics, comprehensiveness, tool extensibility, evidence uniformity, usefulness in collaborative or distributed environments, and investigative consistency are all hindered. At present there exists little support for discovering, acquiring, and representing web-based email, despite its widespread use. To remedy this, a systematic process for discovering, acquiring, and representing web-based email is presented, one that is integrated into the normal forensic analysis workflow and accommodates the distinct characteristics of email evidence. This process focuses on detecting the presence of non-obvious artifacts related to email accounts, retrieving the data from the service provider, and representing email in a well-structured format based on existing standards. As a result, developers and organizations can collaboratively create and use analysis tools that handle email evidence from any source in the same fashion, and examiners can access additional data relevant to their cases. Finally, an extensible framework implementing this process-driven approach has been built in an attempt to address the problems of comprehensiveness, extensibility, uniformity, collaboration and distribution, and consistency within forensic investigations involving email evidence.
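As one possible illustration of the "well-structured format" step, the sketch below uses Python's standard email module to normalize a raw RFC 5322 message into a uniform record. The record fields and the to_evidence_record helper are assumptions made for this example, not a format defined by the thesis.

```python
# Illustrative sketch: normalize an acquired web-mail message into a structured record.
import json
from email import message_from_string
from email.utils import parsedate_to_datetime

RAW_MESSAGE = """\
From: alice@example.com
To: bob@example.com
Subject: Quarterly report
Date: Mon, 06 Jan 2014 10:15:00 -0700

Please find the figures attached.
"""

def to_evidence_record(raw: str, source: str) -> dict:
    """Convert a raw RFC 5322 message into a uniform, tool-agnostic record."""
    msg = message_from_string(raw)
    return {
        "source": source,  # e.g. which web-mail provider the message was acquired from
        "from": msg.get("From"),
        "to": msg.get("To"),
        "subject": msg.get("Subject"),
        "date": parsedate_to_datetime(msg.get("Date")).isoformat(),
        "body": msg.get_payload(),
    }

print(json.dumps(to_evidence_record(RAW_MESSAGE, source="webmail-provider-A"), indent=2))
```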
Contributors: Paglierani, Justin W (Author) / Ahn, Gail-Joon (Thesis advisor) / Yau, Stephen S. (Committee member) / Santanam, Raghu T (Committee member) / Arizona State University (Publisher)
Created: 2013
Description
Leo Kanner first described autism in his 1943 article in Nervous Child, "Autistic Disturbances of Affective Contact." Throughout the article, he describes the eleven children with autism in exacting detail. In the closing paragraphs, he describes the parents of autistic children as emotionally cold; yet he concludes that the condition, as he described it, was innate. Since its publication, his observations about parents have been a source of controversy surrounding the original definition of autism.

Thus far, histories of autism have pointed to these descriptions of the parents of autistic children while claiming that Kanner abstained from assigning them causal significance. Understanding the theoretical context in which Kanner's practice was embedded is essential to sorting out how he could have held such seemingly contrary views simultaneously.

This thesis illustrates that Kanner held an explicitly descriptive frame of reference toward his eleven child patients, their parents, and autism. Adolf Meyer, his mentor at Johns Hopkins, trained him to make detailed life-charts under a clinical framework called psychobiology. By understanding that Kanner was a psychobiologist by training, I revisit the original definition of autism as a category of mental disorder and restate its terms. This history illuminates the theoretical context of autism's discovery and has important implications for the first definition of autism amidst shifting theories of childhood mental disorders and the place of the natural sciences in defining them.
Contributors: Cohmer, Sean (Author) / Hurlbut, James B (Thesis advisor) / Maienschein, Jane (Committee member) / Laubichler, Manfred (Committee member) / Arizona State University (Publisher)
Created: 2014
Description
Most existing security decisions, for both defense and attack, are made with deterministic approaches that give only binary answers. Even though these approaches can achieve a low false positive rate, they have high false negative rates because they cannot accommodate new attack methods and defense techniques. In this dissertation, I study how to discover and use patterns, with their uncertainty and randomness, to counter security challenges. By extracting and modeling patterns in security events, I am able to handle previously unknown security events with quantified confidence, rather than simply making binary decisions. In particular, I address the following four real-world security challenges with pattern-based approaches: 1) How to detect and attribute previously unknown shellcode? I propose instruction sequence abstraction, which extracts coarse-grained patterns from an instruction sequence, and use a Markov chain-based model and support vector machines to detect and attribute shellcode; 2) How to safely mitigate routing attacks in mobile ad hoc networks? I identify routing-table change patterns caused by attacks, propose an extended Dempster-Shafer theory to measure the risk of such changes, and use a risk-aware response mechanism to mitigate routing attacks; 3) How to model, understand, and guess human-chosen picture passwords? I analyze collected human-chosen picture passwords, propose a selection function that models patterns in password selection, and design two algorithms to optimize password-guessing paths; and 4) How to identify influential figures and events in underground social networks? I analyze collected underground social network data, identify user interaction patterns, and propose a suite of measures for systematically discovering and mining adversarial evidence. By solving these four problems, I demonstrate that discovering and using patterns can help address challenges in computer security, network security, human-computer interaction security, and social network security.
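A toy Python sketch of the Markov-chain component described in challenge 1) is shown below. It assumes instructions have already been abstracted into coarse-grained categories; the category names, training data, and function names are invented for illustration rather than taken from the dissertation.

```python
# Toy sketch: first-order Markov model over abstracted instruction categories.
import math
from collections import defaultdict

def train_markov(sequences: list[list[str]]) -> dict[tuple[str, str], float]:
    """Estimate transition probabilities from known shellcode sequences."""
    counts: dict[tuple[str, str], int] = defaultdict(int)
    totals: dict[str, int] = defaultdict(int)
    for seq in sequences:
        for prev, cur in zip(seq, seq[1:]):
            counts[(prev, cur)] += 1
            totals[prev] += 1
    return {pair: counts[pair] / totals[pair[0]] for pair in counts}

def log_likelihood(seq: list[str], model: dict[tuple[str, str], float],
                   floor: float = 1e-6) -> float:
    """Score an unknown sequence; higher means more shellcode-like under this model."""
    return sum(math.log(model.get((prev, cur), floor)) for prev, cur in zip(seq, seq[1:]))

# Invented coarse-grained categories standing in for an instruction sequence abstraction.
shellcode_samples = [["XOR", "MOV", "PUSH", "CALL"], ["XOR", "PUSH", "PUSH", "CALL"]]
model = train_markov(shellcode_samples)
print(log_likelihood(["XOR", "MOV", "PUSH", "CALL"], model))  # relatively high
print(log_likelihood(["NOP", "NOP", "ADD", "RET"], model))    # much lower
```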
Contributors: Zhao, Ziming (Author) / Ahn, Gail-Joon (Thesis advisor) / Yau, Stephen S. (Committee member) / Huang, Dijiang (Committee member) / Santanam, Raghu (Committee member) / Arizona State University (Publisher)
Created: 2014
Description
Neuroimaging has appeared in the courtroom as a type of 'evidence' to support claims about whether or not criminals should be held accountable for their crimes. Yet the ability to abstract notions of culpability and criminal behavior from these images with confidence is unclear. As much remains to be discovered about the relationship between personal responsibility, criminal behavior, and neurological abnormalities, questions have been raised about whether neuroimaging is an appropriate means of validating these claims.

This project explores the limits and legitimacy of neuroimaging as a means of understanding behavior and culpability in determining appropriate criminal sentencing. It highlights key philosophical issues surrounding the ability to use neuroimaging to support this process, and proposes a method of ensuring their proper use. By engaging case studies and a thought experiment, this project illustrates the circumstances in which neuroimaging may assist in identifying particular characteristics relevant for criminal sentencing.

I argue that the question is not whether neuroimaging itself holds validity in determining a criminal's guilt or motives; rather, what matters most is the way in which information about these images is communicated from the 'expert' scientists to the 'non-experts' making decisions about the sentence. Those considering this information's relevance, a judge or jury, are typically not well versed in neuroscience or in interpreting the significance of different images. I maintain that how this information is communicated from the scientist-informer to the decision-maker parallels its actual meaning in importance.

I engage Roger Pielke's model of honest brokering as a solution to ensure the appropriate use of neuroimaging in determining criminal responsibility and sentencing. A thought experiment follows to highlight the limits of science, engage its philosophical repercussions, and illustrate honest brokering as a means of resolution. To achieve this, a hypothetical dialogue reminiscent of Kenneth Schaffner's 'tools for talking' with behavioral geneticists and courtroom professionals exemplifies these ideas.
Contributors: Taddeo, Sarah (Author) / Robert, Jason S (Thesis advisor) / Marchant, Gary (Committee member) / Hurlbut, James B (Committee member) / Arizona State University (Publisher)
Created: 2014
Description
With the advent of technologies such as web services, service-oriented architecture, and cloud computing, modern organizations have to deal with policies such as firewall policies that secure their networks and XACML (eXtensible Access Control Markup Language) policies that control access to critical information and resources. Managing these policies is extremely important for avoiding unintended security leakage through illegal access while maintaining proper access to services for legitimate users. Managing and maintaining access control policies manually over long periods of time is an error-prone task because of their inherently complex nature. Existing tools and mechanisms for policy management use different approaches for different types of policies. This thesis presents a generic framework that provides a unified approach to the analysis and management of different types of policies. The generic approach captures the common semantics and structure of different access control policies through the notion of a policy ontology. This policy ontology representation is then utilized to analyze and manage the policies effectively. The thesis also discusses a proof-of-concept implementation of the proposed generic framework and demonstrates how efficiently this unified approach can be used for the analysis and management of different types of access control policies.
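The following minimal Python sketch illustrates the general idea of mapping heterogeneous policies into one common representation and then running a single analysis over them. The GenericRule structure and the simplified firewall and XACML-style mappings are assumptions for illustration only, not the framework's actual policy ontology.

```python
# Illustrative sketch: a common rule structure for heterogeneous access control policies.
from dataclasses import dataclass

@dataclass(frozen=True)
class GenericRule:
    """Shared semantics: who may (or may not) do what to which resource."""
    subject: str
    action: str
    resource: str
    effect: str  # "permit" or "deny"

def from_firewall(src: str, dst: str, port: int, decision: str) -> GenericRule:
    """Map a simplified firewall rule into the generic representation."""
    return GenericRule(subject=src, action=f"connect:{port}", resource=dst, effect=decision)

def from_xacml_like(subject: str, action: str, resource: str, effect: str) -> GenericRule:
    """Map a simplified XACML-style rule into the same representation."""
    return GenericRule(subject=subject, action=action, resource=resource, effect=effect)

def find_conflicts(rules: list[GenericRule]) -> list[tuple[GenericRule, GenericRule]]:
    """A simple analysis enabled by the unified form: same request, opposite effects."""
    return [(a, b) for i, a in enumerate(rules) for b in rules[i + 1:]
            if (a.subject, a.action, a.resource) == (b.subject, b.action, b.resource)
            and a.effect != b.effect]

rules = [from_firewall("10.0.0.5", "db-server", 5432, "permit"),
         from_xacml_like("10.0.0.5", "connect:5432", "db-server", "deny")]
print(find_conflicts(rules))
```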
Contributors: Kulkarni, Ketan (Author) / Ahn, Gail-Joon (Thesis advisor) / Yau, Stephen S. (Committee member) / Huang, Dijiang (Committee member) / Arizona State University (Publisher)
Created: 2011
Description
Corporations in biomedicine hold significant power and influence, in both political and personal spheres. The decisions these companies make about ethics are critically important, as they help determine what products are developed, how they are developed, how they are promoted, and potentially even how they are regulated. In the last fifteen years, for-profit private companies have been assembling bioethics committees to help resolve dilemmas that require informed deliberation about ethical, legal, scientific, and economic considerations. Private-sector bioethics committees represent an important innovation in the governance of emerging technologies, with corporations taking a lead role in deciding what is ethically appropriate or problematic. And yet, we know very little about these committees, including their structures, memberships, mandates, authority, and impact. Drawing on an extensive literature review and qualitative analysis of semi-structured interviews with executives, scientists, and board members, this dissertation provides an in-depth analysis of the Ethics and Public Policy Board at SmithKline Beecham, the Ethics Advisory Board at Advanced Cell Technology, and the Bioethics Committee at Eli Lilly, and offers insights about how ideas of bioethics and governance are currently imagined and enacted within corporations. The SmithKline Beecham board was the first private-sector bioethics committee; its mandate was to explore, in a comprehensive and balanced analysis, the ethics of macro trends in science and technology. The Advanced Cell Technology board was created as a watchdog for the company, to prevent it from making major errors. The Eli Lilly board differs from the others in that it is made up mostly of internal employees and conducts research ethics consultations within the company. These private-sector bioethics committees evaluate and construct new boundaries between their private interests and the public values they claim to promote. Findings from this dissertation show that criticisms of private-sector bioethics that focus narrowly on financial conflicts of interest and a lack of transparency obscure analysis of the ideas about governance (about expertise, credibility, and authority) that emerge from these structures and hamper serious debate about the possible impacts of moving ethical deliberation from the public to the private sector.
Contributors: Brian, Jennifer (Author) / Robert, Jason S (Thesis advisor) / Maienschein, Jane (Committee member) / Hurlbut, James B (Committee member) / Sarewitz, Daniel (Committee member) / Brown, Mark B. (Committee member) / Moreno, Jonathan D. (Committee member) / Arizona State University (Publisher)
Created: 2012
Description
In modern healthcare environments, there is a strong need for an infrastructure that reduces the time-consuming effort and costly operations required to obtain a patient's complete medical record and that uniformly integrates this heterogeneous collection of medical data for delivery to healthcare professionals. As a result, healthcare providers are increasingly willing to shift their electronic medical record (EMR) systems to clouds, which can remove geographical distance barriers among providers and patients. Even though cloud-based EMRs have received considerable attention for their promise of lower operational cost and better interoperability with other healthcare providers, the adoption of security-aware cloud systems has become an extremely important prerequisite for bringing interoperability and efficient management to the healthcare industry. Since a shared electronic health record (EHR) essentially represents a virtualized aggregation of distributed clinical records from multiple healthcare providers, sharing such integrated EHRs must comply with the various authorization policies of these data providers. In this work, we focus on the authorized and selective sharing of EHRs among several parties with different duties and objectives, in a way that satisfies access control and compliance requirements in healthcare cloud computing environments. We present a secure medical data sharing framework to support selective sharing of composite EHRs aggregated from various healthcare providers and compliance with HIPAA regulations. Our approach also ensures that privacy concerns are accommodated when processing access requests to patients' healthcare information. To realize the proposed approach, we design and implement a cloud-based EHR sharing system. In addition, we describe case studies and evaluation results that demonstrate the effectiveness and efficiency of our approach.
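A minimal Python sketch of the selective-sharing idea is given below. It assumes each EHR section carries its originating provider's sharing policy as a set of permitted roles, which is a deliberate simplification of the policy and compliance model described in the abstract; the names EHRSection and selective_share are invented for this example.

```python
# Illustrative sketch: selective disclosure of a composite EHR aggregated from several providers.
from dataclasses import dataclass, field

@dataclass
class EHRSection:
    provider: str    # originating healthcare provider
    category: str    # e.g. "medications", "mental-health"
    data: str
    allowed_roles: set = field(default_factory=set)  # that provider's sharing policy

def selective_share(sections: list[EHRSection], requester_role: str) -> list[EHRSection]:
    """Release only the sections whose originating provider's policy permits this role."""
    return [s for s in sections if requester_role in s.allowed_roles]

composite_ehr = [
    EHRSection("Clinic A", "medications",   "atorvastatin 20mg", {"physician", "pharmacist"}),
    EHRSection("Clinic B", "mental-health", "therapy notes",     {"physician"}),
]

# A pharmacist sees only what every contributing provider's policy allows.
for section in selective_share(composite_ehr, "pharmacist"):
    print(section.provider, section.category)
```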
Contributors: Wu, Ruoyu (Author) / Ahn, Gail-Joon (Thesis advisor) / Yau, Stephen S. (Committee member) / Huang, Dijiang (Committee member) / Arizona State University (Publisher)
Created: 2012