This collection includes most of the ASU Theses and Dissertations from 2011 to the present. ASU Theses and Dissertations are available in downloadable PDF format; however, a small percentage of items are under embargo. Information about the dissertations/theses includes degree information, committee members, an abstract, and supporting data or media.

In addition to the electronic theses in the ASU Digital Repository, ASU Theses and Dissertations can also be found in the ASU Library Catalog.

Dissertations and Theses granted by Arizona State University are archived and made available through a joint effort of the ASU Graduate College and the ASU Libraries. For more information or questions about this collection, visit the Digital Repository ETD Library Guide or contact the ASU Graduate College at gradformat@asu.edu.

Displaying 1 - 10 of 173

Description

Once perceived as an unimportant occurrence in living organisms, cell degeneration was reconfigured as an important biological phenomenon in development, aging, health, and disease in the twentieth century. This dissertation tells a twentieth-century history of scientific investigations on cell degeneration, including cell death and aging. By describing four central developments in cell degeneration research in its four major chapters, I trace the emergence of the degenerating cell as a scientific object, describe the generation of a variety of concepts, interpretations, and usages associated with cell death and aging, and analyze the transforming influences of the rising cell degeneration research. In particular, the four chapters show how the changing scientific practices about cellular life in embryology, cell culture, aging research, and the molecular biology of Caenorhabditis elegans shaped interpretations of cell degeneration in the twentieth century as life-shaping, limit-setting, complex, yet regulated. These events created and consolidated important concepts in the life sciences such as programmed cell death, the Hayflick limit, apoptosis, and death genes. These cases also subsequently transformed the material and epistemic practices concerning the end of cellular life and led to the formation of new research communities. The four cases together show the ways cell degeneration became a shared subject among molecular cell biology, developmental biology, gerontology, oncology, and the pathology of degenerative diseases. These practices and perspectives created a special kind of interconnectivity between different fields and led to a level of interdisciplinarity within cell degeneration research by the early 1990s.
ContributorsJiang, Lijing (Author) / Maienschein, Jane (Thesis advisor) / Laubichler, Manfred (Thesis advisor) / Hurlbut, James (Committee member) / Creath, Richard (Committee member) / White, Michael (Committee member) / Arizona State University (Publisher)
Created2013
Description

Cancer is the second leading cause of death in the United States, and novel methods of treating advanced malignancies are of high importance. Of these deaths, prostate cancer and breast cancer are the second most fatal carcinomas in men and women, respectively, while pancreatic cancer is the fourth most fatal in both men and women. Developing new drugs for the treatment of cancer is both a slow and expensive process. It is estimated that it takes an average of 15 years and an expense of $800 million to bring a single new drug to the market. However, it is also estimated that nearly 40% of that cost could be avoided by finding alternative uses for drugs that have already been approved by the Food and Drug Administration (FDA). The research presented in this document describes the testing, identification, and mechanistic evaluation of novel methods for treating many human carcinomas using drugs previously approved by the FDA. A tissue culture plate-based screening of FDA-approved drugs will identify compounds that can be used in combination with the protein TRAIL to induce apoptosis selectively in cancer cells. Identified leads will next be optimized using high-throughput microfluidic devices to determine the most effective treatment conditions. Finally, a rigorous mechanistic analysis will be conducted to understand how the FDA-approved drug mitoxantrone sensitizes cancer cells to TRAIL-mediated apoptosis.
ContributorsTaylor, David (Author) / Rege, Kaushal (Thesis advisor) / Jayaraman, Arul (Committee member) / Nielsen, David (Committee member) / Kodibagkar, Vikram (Committee member) / Dai, Lenore (Committee member) / Arizona State University (Publisher)
Created2013
Description

Although high-performance, lightweight composites are increasingly being used in applications ranging from aircraft and rotorcraft to weapon systems and ground vehicles, the assurance of structural reliability remains a critical issue. In composites, damage is absorbed through various fracture processes, including fiber failure, matrix cracking and delamination. An important element in achieving reliable composite systems is a strong capability of assessing and inspecting physical damage of critical structural components. Installation of a robust Structural Health Monitoring (SHM) system would be very valuable in detecting the onset of composite failure. A number of major issues still require serious attention in connection with the research and development aspects of sensor-integrated reliable SHM systems for composite structures. In particular, the sensitivity of currently available sensor systems does not allow detection of micro-level damage; this limits the capability of data-driven SHM systems. As a fundamental layer in SHM, modeling can provide in-depth information on material and structural behavior for sensing and detection, as well as data for learning algorithms. This dissertation focuses on the development of a multiscale analysis framework, which is used to detect various forms of damage in complex composite structures. A generalized method of cells based micromechanics analysis, as implemented in NASA's MAC/GMC code, is used for the micro-level analysis. First, a baseline study of MAC/GMC is performed to determine the governing failure theories that best capture the damage progression. The deficiencies associated with various layups and loading conditions are addressed. In most micromechanics analyses, a representative unit cell (RUC) with a common fiber packing arrangement is used. The effect of variation in this arrangement within the RUC has been studied, and results indicate that this variation influences the macro-scale effective material properties and failure stresses. The developed model has been used to simulate impact damage in a composite beam and an airfoil structure. The model data were verified through active interrogation using piezoelectric sensors. The multiscale model was further extended to develop a coupled damage and wave attenuation model, which was used to study different damage states such as fiber-matrix debonding in composite structures with surface-bonded piezoelectric sensors.
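To make the micromechanics idea concrete, the following sketch is a far simpler stand-in for the generalized method of cells used in this work: it computes the classical Voigt (rule-of-mixtures) and Reuss bounds on the effective axial modulus of a unidirectional ply as the fiber volume fraction of the unit cell varies. The constituent moduli are illustrative assumptions (roughly carbon fiber in an epoxy matrix), not inputs taken from this dissertation or from MAC/GMC.

# Minimal micromechanics sketch: Voigt/Reuss bounds on the effective axial
# modulus of a unidirectional composite as a function of fiber volume fraction.
# Constituent moduli below are assumed, illustrative values (GPa).

def voigt_modulus(e_fiber, e_matrix, vf):
    """Upper-bound (iso-strain, rule-of-mixtures) effective modulus."""
    return vf * e_fiber + (1.0 - vf) * e_matrix

def reuss_modulus(e_fiber, e_matrix, vf):
    """Lower-bound (iso-stress) effective modulus."""
    return 1.0 / (vf / e_fiber + (1.0 - vf) / e_matrix)

E_FIBER, E_MATRIX = 230.0, 3.5  # GPa, assumed carbon fiber / epoxy values

for vf in (0.50, 0.55, 0.60, 0.65):
    print(f"Vf = {vf:.2f}: Voigt = {voigt_modulus(E_FIBER, E_MATRIX, vf):6.1f} GPa, "
          f"Reuss = {reuss_modulus(E_FIBER, E_MATRIX, vf):5.2f} GPa")

Even this crude estimate shows how a change in the constituent fractions within the unit cell propagates to the macro-scale effective stiffness, which is the relationship the full GMC analysis resolves in much greater detail.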
ContributorsMoncada, Albert (Author) / Chattopadhyay, Aditi (Thesis advisor) / Dai, Lenore (Committee member) / Papandreou-Suppappola, Antonia (Committee member) / Rajadas, John (Committee member) / Yekani Fard, Masoud (Committee member) / Arizona State University (Publisher)
Created2012
Description

Whole genome sequencing (WGS) and whole exome sequencing (WES) are two comprehensive genomic tests which use next-generation sequencing technology to sequence most of the 3.2 billion base pairs in a human genome (WGS) or many of the estimated 22,000 protein-coding genes in the genome (WES). The promises offered by WGS/WES are: to identify suspected yet unidentified genetic diseases, to characterize the genomic mutations in a tumor to identify targeted therapeutic agents, and to predict future diseases with the hope of promoting disease prevention strategies and/or offering early treatment. Promises notwithstanding, sequencing a human genome presents several interrelated challenges: how to adequately analyze, interpret, store, reanalyze, and apply an unprecedented amount of genomic data (with uncertain clinical utility) to patient care? In addition, genomic data has the potential to become integral to improving the medical care of an individual and their family, years after a genome is sequenced. Current informed consent protocols do not adequately address the unique challenges and complexities inherent to the process of WGS/WES. This dissertation constructs a novel informed consent process for individuals considering WGS/WES, capable of fulfilling both the legal and ethical requirements of medical consent while addressing the intricacies of WGS/WES, ultimately resulting in a more effective consenting experience. To better understand the components of an effective consenting experience, the first part of this dissertation traces the historical origin of the informed consent process to identify the motivations, rationales, and institutional commitments that sustain our current consenting protocols for genetic testing. After examining the underlying commitments that shape our current informed consent protocols, I discuss the effectiveness of the informed consent process from an ethical and legal standpoint. I illustrate how WGS/WES introduces new complexities to the informed consent process and assess whether informed consent protocols proposed for WGS/WES address these complexities. The last section of this dissertation describes a novel informed consent process for WGS/WES, constructed from the original ethical intent of informed consent, analysis of existing informed consent protocols, and my own observations as a genetic counselor about what constitutes an effective consenting experience.
ContributorsHunt, Katherine (Author) / Hurlbut, J. Benjamin (Thesis advisor) / Robert, Jason S. (Thesis advisor) / Maienschein, Jane (Committee member) / Northfelt, Donald W. (Committee member) / Marchant, Gary (Committee member) / Ellison, Karin (Committee member) / Arizona State University (Publisher)
Created2013
Description

In 1997, developmental biologist Michael Richardson compared his research team's embryo photographs to Ernst Haeckel's 1874 embryo drawings and called Haeckel's work noncredible. Science soon published “Haeckel's Embryos: Fraud Rediscovered,” and Richardson's comments further reinvigorated criticism of Haeckel by others, with articles in The American Biology Teacher, “Haeckel's Embryos and Evolution: Setting the Record Straight,” and the New York Times, “Biology Text Illustrations More Fiction Than Fact.” Meanwhile, others emphatically stated that the goal of comparative embryology was not to resurrect Haeckel's work. At the center of the controversy was Haeckel's no-longer-accepted idea of recapitulation. Haeckel believed that the development of an embryo revealed the adult stages of the organism's ancestors. Haeckel represented this idea with drawings of vertebrate embryos at similar developmental stages. This is Haeckel's embryo grid, the most common of all illustrations in biology textbooks. Yet Haeckel's embryo grids are much more complex than any textbook explanation. I examined 240 high school biology textbooks, from 1907 to 2010, for embryo grids. I coded and categorized the grids according to accompanying discussion of (a) embryonic similarities, (b) recapitulation, (c) common ancestors, and (d) evolution. The textbooks show changing narratives. Embryo grids gained prominence in the 1940s, and the trend continued until criticisms of Haeckel reemerged in the late 1990s, resulting in (a) grids with fewer organisms and developmental stages or (b) no grid at all. Discussion about embryos and evolution dropped significantly.
ContributorsWellner, Karen L (Author) / Maienschein, Jane (Thesis advisor) / Ellison, Karin D. (Committee member) / Creath, Richard (Committee member) / Robert, Jason S. (Committee member) / Laubichler, Manfred D. (Committee member) / Arizona State University (Publisher)
Created2014
Description

This thesis presents approaches to develop micro seismometers and accelerometers based on molecular electronic transducer (MET) technology using MicroElectroMechanical Systems (MEMS) techniques. MET is a technology applied in seismic instrumentation that proves highly beneficial to planetary seismology. It consists of an electrochemical cell that senses the movement of a liquid electrolyte between electrodes by converting it to an output current. MET seismometers have the advantages of high sensitivity, a low noise floor, small size, the absence of fragile mechanical moving parts, and independence from the direction of the sensitivity axis. By using MEMS techniques, a micro MET seismometer is developed with inter-electrode spacing close to 1 μm, which improves the sensitivity of the fabricated device to above 3000 V/(m/s^2) under an operating bias of 600 mV and an input acceleration of 400 μG (G = 9.81 m/s^2) at 0.32 Hz. Lowering the hydrodynamic resistance by increasing the number of channels improves the self-noise to -127 dB, equivalent to 44 nG/√Hz, at 1 Hz. An alternative approach to building the sensing element of the MEMS MET seismometer using an SOI process is also presented in this thesis. The significantly increased number of channels is expected to improve the noise performance. Inspired by the advantages of combining MET and MEMS technologies in the development of the seismometer, a low-frequency accelerometer utilizing MET technology with post-CMOS-compatible fabrication processes is developed. In the fabricated accelerometer, the complicated fabrication of the mass-spring system in a solid-state MEMS accelerometer is replaced with a much simpler post-CMOS-compatible process containing only the deposition of a four-electrode MET structure on a planar substrate and a liquid inertial mass of an electrolyte droplet encapsulated by an oil film. The fabrication process does not involve focused ion beam milling, which is used in the micro MET seismometer fabrication; thus, the cost is lowered. Furthermore, the planar structure and the novel idea of using an oil film as the sealing diaphragm eliminate the complicated three-dimensional packaging of the seismometer. The fabricated device achieves 10.8 V/G sensitivity at 20 Hz with a nearly flat response over the frequency range from 1 Hz to 50 Hz, and a low noise floor of 75 μG/√Hz at 20 Hz.
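As a quick arithmetic check of the quoted self-noise figure (assuming, as is conventional, that the -127 dB value is referenced to an acceleration power spectral density of 1 (m/s^2)^2/Hz), the two numbers in the abstract are consistent:

\[
44\ \mathrm{nG}/\sqrt{\mathrm{Hz}} = 44\times10^{-9}\times 9.81\ \mathrm{(m/s^2)}/\sqrt{\mathrm{Hz}} \approx 4.3\times10^{-7}\ \mathrm{(m/s^2)}/\sqrt{\mathrm{Hz}},
\qquad
10\log_{10}\!\left[\left(4.3\times10^{-7}\right)^{2}\right] \approx -127\ \mathrm{dB}.
\]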
ContributorsHuang, Hai (Author) / Yu, Hongyu (Thesis advisor) / Jiang, Hanqing (Committee member) / Dai, Lenore (Committee member) / Si, Jennie (Committee member) / Arizona State University (Publisher)
Created2014
Description

Alzheimer's disease (AD) is the most common type of dementia, affecting one in nine people age 65 and older. One of the most important neuropathological characteristics of Alzheimer's disease is the aggregation and deposition of the protein beta-amyloid. Beta-amyloid is produced by proteolytic processing of the Amyloid Precursor Protein (APP). Production of beta-amyloid from APP is increased when cells are subjected to stress, since both APP and beta-secretase are upregulated by stress. An increased beta-amyloid level promotes aggregation of beta-amyloid into toxic species, which cause an increase in reactive oxygen species (ROS) and a decrease in cell viability. Therefore, reducing beta-amyloid generation is a promising method to control cell damage following stress. The goal of this thesis was to test the effect of inhibiting beta-amyloid production in a stressed AD cell model. Hydrogen peroxide was used as the stressing agent. Two treatments were used to inhibit beta-amyloid production: iBSec1, an scFv designed to block the beta-secretase site of APP, and DIA10D, a bispecific tandem scFv engineered to cleave the alpha-secretase site of APP and block the beta-secretase site of APP. The iBSec1 treatment was added extracellularly, while DIA10D was stably expressed inside the cell using the PSECTAG vector. An increase in reactive oxygen species and a decrease in cell viability were observed after addition of hydrogen peroxide to the AD cell model. The increase in stress-induced toxicity caused by addition of hydrogen peroxide was dramatically decreased by simultaneously treating the cells with iBSec1 or DIA10D to block the increase in beta-amyloid levels resulting from the upregulation of APP and beta-secretase.
ContributorsSuryadi, Vicky (Author) / Sierks, Michael (Thesis advisor) / Nielsen, David (Committee member) / Dai, Lenore (Committee member) / Arizona State University (Publisher)
Created2014
Description

Lung Cancer Alliance, a nonprofit organization, released the "No One Deserves to Die" advertising campaign in June 2012. The campaign visuals presented a clean, simple message to the public: the stigma associated with lung cancer drives marginalization of lung cancer patients. Lung Cancer Alliance (LCA) asserts that negative public attitude toward lung cancer stems from unacknowledged moral judgments that generate 'stigma.' The campaign materials are meant to expose and challenge these common public category-making processes that occur when subconsciously evaluating lung cancer patients. These processes involve comparison, perception of difference, and exclusion. The campaign implies that society sees the suffering of lung cancer patients as indicative of moral failure and thus not warranting assistance from society, which leads to marginalization of the diseased. Attributing to society a morally laden view of the disease, the campaign extends this view to its logical end and makes it explicit: lung cancer patients no longer deserve to live because they themselves caused the disease (by smoking). This judgment and the resulting marginalization are, according to LCA, evident in the ways lung cancer patients are marginalized relative to those with other diseases via minimal research funding, high mortality rates, and low awareness of the disease. Therefore, society commits an injustice against those with lung cancer. This research analyzes the relationship between disease, identity-making, and responsibilities within society as represented by this stigma framework. LCA asserts that society understands lung cancer in terms of stigma and advocates that society's understanding of lung cancer should be shifted from a stigma framework toward a medical framework. Analysis of the identity-making and responsibility encoded in both frameworks contributes to an evaluation of the significance of reframing this disease. One aim of this thesis is to explore the relationship between these frameworks in medical sociology. The results show a complex interaction that suggests trading one frame for another will not destigmatize the lung cancer patient. Those interactions cause tangible harms, such as high mortality rates, and there are important implications for other communities that experience a stigmatized disease.
ContributorsCalvelage, Victoria (Author) / Hurlbut, J. Benjamin (Thesis advisor) / Maienschein, Jane (Committee member) / Ellison, Karin (Committee member) / Arizona State University (Publisher)
Created2013
Description

This research focuses on the benefits of using nanocomposites in aerospace structural components to prevent or delay the onset of unique composite failure modes, such as delamination. Analytical, numerical, and experimental analyses were conducted to provide a comprehensive understanding of how carbon nanotubes (CNTs) can provide additional structural integrity when they are used in specific hot spots within a structure. A multiscale approach was implemented to determine the mechanical and thermal properties of the nanocomposites, which were used in detailed finite element models (FEMs) to analyze interlaminar failures in T and Hat section stringers. The delamination that first occurs between the tow filler and the bondline between the stringer and skin was of particular interest. Both locations are considered to be hot spots in such structural components, and failures tend to initiate from these areas. In this research, nanocomposite use was investigated as an alternative to traditional methods of suppressing delamination. The stringer was analyzed under different loading conditions and assuming different structural defects. Initial damage, defined as the first drop in the load-displacement curve, was considered to be a useful variable for comparing the different behaviors in this study and was detected via the virtual crack closure technique (VCCT) implemented in the FE analysis.

Experiments were conducted to test T section skin/stringer specimens under pull-off loading, replicating those used in composite panels as stiffeners. Two types of designs were considered: one using pure epoxy to fill the tow region and another that used nanocomposite with 5 wt. % CNTs. The response variable in the tests was the initial damage. Detailed analyses were conducted using FEMs to correlate with the experimental data, and the correlation between the experiments and the model was satisfactory. Finally, the effects of thermal cure and temperature variation on nanocomposite structure behavior were studied, and both variables were determined to influence the nanocomposite structure's performance.
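For context on the damage-detection criterion mentioned above, the standard two-dimensional VCCT estimate of the mode-I energy release rate at a crack-tip node pair is the textbook expression (not specific to the finite element models in this dissertation):

\[
G_{\mathrm{I}} \approx \frac{F_{y}\,\Delta v}{2\,\Delta a\,b},
\]

where F_y is the nodal force at the crack tip normal to the crack plane, Δv is the relative opening displacement of the node pair immediately behind the tip, Δa is the element length along the crack path, and b is the specimen width. Delamination is assumed to advance once the computed energy release rate exceeds the critical value G_c of the interface.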
ContributorsHasan, Zeaid (Author) / Chattopadhyay, Aditi (Thesis advisor) / Dai, Lenore (Committee member) / Jiang, Hanqing (Committee member) / Rajadas, John (Committee member) / Liu, Yongming (Committee member) / Arizona State University (Publisher)
Created2014
Description

Life Cycle Assessment (LCA) is used in the chemical process sector to compare the environmental merits of different product or process alternatives. One of the tasks that involves much time and cost in LCA studies is the specification of the exact materials and processes modeled, which has limited the method's widespread application. To overcome this, researchers have recently created probabilistic underspecification as an LCA streamlining method, which uses a structured data classification system to enable an LCA modeler to specify materials and processes in a less precise manner. This study presents a statistical procedure to understand when streamlined LCA methods can be used and what their impact on overall model uncertainty is. Petrochemical and polymer product systems were chosen to examine the impacts of underspecification and mis-specification applied to LCA modeling. The Ecoinvent database, extracted using GaBi software, was used for data pertaining to generic crude oil refining and polymer manufacturing modules. By assessing the variation in LCA results arising out of streamlined materials classification, the developed statistics estimate the amount of overall error incurred by underspecifying and mis-specifying material impact data in streamlined LCA. To test the impact of underspecification and mis-specification at the level of a product footprint, case studies of HDPE containers and aerosol air fresheners were conducted. Results indicate that the variation in LCA results decreases as the specificity of materials increases. For the product systems examined, results show that most of the variability in impact assessment is due to differences in the regions from which the environmental impact datasets were collected; the lower levels of categorization of materials have a relatively smaller influence on the variance. Analyses further signify that only certain environmental impact categories, viz. global warming potential, freshwater eutrophication, freshwater ecotoxicity, human toxicity, and terrestrial ecotoxicity, are affected by geographic variations. Outcomes of the case studies point out that the error in the estimation of global warming potential increases as the specificity of a component of the product decreases. Fossil depletion impact estimates remain relatively robust to underspecification. Further, the results of LCA are much more sensitive to underspecification of materials and processes than to mis-specification.
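The sketch below illustrates the basic mechanics of probabilistic underspecification on a single impact category: when only a coarse material category is known, a dataset is drawn at random from the candidates in that category and the spread of the resulting footprint is reported. The global warming potential factors are illustrative placeholders, not values from Ecoinvent or GaBi, and the uniform-sampling assumption is a simplification of the structured classification used in this work.

import random
import statistics

# Hypothetical GWP factors (kg CO2-eq per kg material) for datasets grouped
# under the coarse category "thermoplastic polymer"; placeholder values only.
CANDIDATE_GWP = {
    "HDPE": 1.9,
    "LDPE": 2.1,
    "PP": 1.95,
    "PET": 2.7,
    "PS": 3.4,
}

def underspecified_gwp(mass_kg, factors, n_draws=10_000, seed=0):
    """Monte Carlo estimate of the footprint spread when the exact material
    is unknown and each dataset in the coarse category is equally likely."""
    rng = random.Random(seed)
    draws = [mass_kg * rng.choice(list(factors.values())) for _ in range(n_draws)]
    return statistics.mean(draws), statistics.stdev(draws)

mean_gwp, sd_gwp = underspecified_gwp(0.05, CANDIDATE_GWP)  # a 50 g container
print(f"GWP ~ {mean_gwp:.3f} +/- {sd_gwp:.3f} kg CO2-eq")

Specifying the material more precisely amounts to shrinking the candidate set, which is one way to see why the variation in results decreases as specificity increases.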
ContributorsMurali, Ashwin Krishna (Author) / Dooley, Kevin (Thesis advisor) / Dai, Lenore (Thesis advisor) / Nielsen, David (Committee member) / Arizona State University (Publisher)
Created2014