Matching Items (122)
Description

Fluoroquinolone antibiotics are known to cause severe, multisystem adverse side effects, termed fluoroquinolone toxicity (FQT). This toxicity syndrome can present with adverse effects that vary from individual to individual, including effects on the musculoskeletal and nervous systems, among others. The mechanism behind FQT in mammals is not known, although various possibilities have been investigated. Among the hypothesized FQT mechanisms, those that could potentially explain multisystem toxicity include off-target mammalian topoisomerase interactions; increased production of reactive oxygen species, oxidative stress, and oxidative damage; and the metal-chelating properties of FQs. This review presents relevant information on fluoroquinolone antibiotics and FQT and explores the mechanisms that have been proposed. A fluoroquinolone-induced increase in reactive oxygen species, with subsequent oxidative stress and damage, presents the strongest evidence to explain this multisystem toxicity syndrome. Understanding the mechanism of FQT in mammals is important to aid in the prevention and treatment of this condition.

ContributorsHall, Brooke Ashlyn (Author) / Redding, Kevin (Thesis director) / Wideman, Jeremy (Committee member) / Borges, Chad (Committee member) / School of Molecular Sciences (Contributor) / Barrett, The Honors College (Contributor)
Created2021-05
Description
In this thesis I introduce a new direction in computing based on nonlinear chaotic dynamics. The main idea is that the rich dynamics of a chaotic system enable us to (1) build better computers that have a flexible instruction set, and (2) carry out computations that conventional computers are not good at. I start from the theory, explaining how one can build a computing logic block using a chaotic system, and then introduce a new theoretical analysis for chaos computing. Specifically, I demonstrate how unstable periodic orbits, and a model based on them, explain and predict how, and how well, a chaotic system can compute. Furthermore, since unstable periodic orbits and their stability measures (in terms of eigenvalues) can be extracted from experimental time series, I develop a time-series technique for modeling and predicting chaos computing from a given time series of a chaotic system. After building a theoretical framework for chaos computing, I proceed to the architecture of these chaos-computing blocks, describing how one can arrange and organize them to build a sophisticated computing system. I propose a new computer architecture based on chaos computing that shifts the limits of conventional computers by introducing a flexible instruction set: the user can load a desired instruction set into the computer to reconfigure it as an implementation of that instruction set. Apart from the direct application of chaos theory to generic computation, the application of chaos theory to speech processing is explained, and a novel application in speech coding and synthesis is introduced. More specifically, it is demonstrated how a chaotic system can model the natural turbulent flow of air in the human speech production system and how chaotic orbits can be used to excite a vocal tract model.
As another approach to building computing systems from nonlinear dynamics, the idea of Logical Stochastic Resonance is studied and adapted to an autoregulatory gene network in the bacteriophage λ.
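A minimal toy illustration of the chaos-computing idea, assuming the standard logistic-map construction from the chaos-computing literature rather than the specific blocks built in this thesis: binary inputs are encoded as perturbations of the initial condition, the chaotic map is iterated once, and the output is read off through a threshold. Changing that single threshold parameter reconfigures the same chaotic element into a different logic gate; all constants below are illustrative choices.

```python
def logistic(x, r=4.0):
    """One iteration of the chaotic logistic map x -> r*x*(1-x)."""
    return r * x * (1.0 - x)

def chaotic_gate(a, b, threshold, delta=0.25, x_star=0.0):
    """Encode two binary inputs as a shift of the initial condition,
    iterate the map once, and threshold the result to get a binary output."""
    x0 = x_star + (a + b) * delta
    return 1 if logistic(x0) > threshold else 0

# The same chaotic element morphs between gates as the threshold changes:
AND = lambda a, b: chaotic_gate(a, b, threshold=0.8)
OR = lambda a, b: chaotic_gate(a, b, threshold=0.5)
```

Here f(0) = 0, f(0.25) = 0.75, and f(0.5) = 1.0, so a threshold of 0.8 singles out the (1, 1) input (AND) while 0.5 separates any nonzero input (OR).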
ContributorsKia, Behnam (Author) / Ditto, William (Thesis advisor) / Huang, Liang (Committee member) / Lai, Ying-Cheng (Committee member) / Helms Tillery, Stephen (Committee member) / Arizona State University (Publisher)
Created2011
Description
In an effort to begin validating the large number of discovered candidate biomarkers, proteomics is beginning to shift from shotgun proteomic experiments towards targeted proteomic approaches that provide solutions to automation and economic concerns. Such approaches to validating biomarkers necessitate the mass spectrometric analysis of hundreds to thousands of human samples. As this takes place, a serendipitous opportunity has become evident: as one narrows the focus towards "single" protein targets (instead of entire proteomes) using pan-antibody-based enrichment techniques, a discovery science emerges, so to speak. This is due to the largely unknown context in which "single" proteins exist in blood (i.e., polymorphisms, transcript variants, and posttranslational modifications); hence, targeted proteomics has applications even for established biomarkers. Furthermore, besides protein heterogeneity accounting for interferences with conventional immunometric platforms, it is becoming evident that this formerly hidden dimension of structural information also contains rich pathobiological information. Consequently, targeted proteomics studies that aim to ascertain a protein's genuine presentation within disease-stratified populations, and that serve as a stepping-stone within a biomarker translational pipeline, are of clinical interest. Roughly 128 million Americans are pre-diabetic, diabetic, and/or have kidney disease, and public and private spending on treating these diseases runs in the hundreds of billions of dollars. In an effort to create new solutions for the early detection and management of these conditions, described herein are the design, development, and translation of mass spectrometric immunoassays targeted towards diabetes and kidney disease. Population proteomics experiments were performed for the following clinically relevant proteins: insulin, C-peptide, RANTES, and parathyroid hormone. At least thirty-eight protein isoforms were detected.
Besides the numerous disease correlations found within the disease-stratified cohorts, certain isoforms also appeared to be causally related to the underlying pathophysiology and/or to have therapeutic implications. Technical advancements include multiplexed isoform quantification as well as a "dual-extraction" methodology for eliminating non-specific proteins while simultaneously validating isoforms. Industrial efforts towards widespread clinical adoption are also described. Consequently, this work lays a foundation for the translation of mass spectrometric immunoassays into the clinical arena and presents the most recent advancements concerning the mass spectrometric immunoassay approach.
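As a sketch of how a detected isoform might be assigned in this kind of analysis (an illustration, not the actual MSIA pipeline; the base mass and tolerance are hypothetical, while the modification masses are standard monoisotopic values):

```python
def assign_isoform(measured_mass, base_mass, mods, tol_ppm=50.0):
    """Match a measured peak to a candidate isoform by comparing the
    observed mass shift from the unmodified protein against a table of
    known modification masses (in Da), within a ppm tolerance."""
    shift = measured_mass - base_mass
    for name, delta in mods.items():
        if abs(shift - delta) <= measured_mass * tol_ppm / 1e6:
            return name
    return None

# Standard monoisotopic mass shifts for two common modifications:
MODS = {"oxidation": 15.9949, "phosphorylation": 79.9663}
```

With a hypothetical base mass of 10000.0 Da, a peak at 10079.97 Da would be assigned to the phosphorylated isoform, while a shift matching no table entry is left unassigned.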
ContributorsOran, Paul (Author) / Nelson, Randall (Thesis advisor) / Hayes, Mark (Thesis advisor) / Ros, Alexandra (Committee member) / Williams, Peter (Committee member) / Arizona State University (Publisher)
Created2011
Description
Surgery as a profession requires significant training to improve both clinical decision making and psychomotor proficiency. In the medical knowledge domain, tools have been developed, validated, and accepted for evaluation of surgeons' competencies. However, assessment of psychomotor skills still relies on the Halstedian model of apprenticeship, wherein surgeons are observed during residency for judgment of their skills. Although the value of this method of skills assessment cannot be ignored, novel methodologies of objective skills assessment need to be designed, developed, and evaluated to augment the traditional approach. Several sensor-based systems have been developed to measure a user's skill quantitatively, but the use of sensors could interfere with skill execution and thus limit the potential for evaluating real-life surgery. Nevertheless, a method to judge skills automatically in real-life conditions should be the ultimate goal, since only with such capability would a system be widely adopted. This research proposes a novel video-based approach for observing surgeons' hand and surgical tool movements in minimally invasive surgical training exercises as well as during laparoscopic surgery. Because the system does not require surgeons to wear special sensors, it has the distinct advantage over alternatives of offering skills assessment in both learning and real-life environments. The system automatically detects major skill-measuring features from surgical task videos using a series of computer vision algorithms and provides on-screen real-time performance feedback for more efficient skill learning. Finally, a machine-learning approach is used to develop an observer-independent composite scoring model through objective and quantitative measurement of surgical skills.
To increase the effectiveness and usability of the developed system, it is integrated with a cloud-based tool that automatically assesses surgical videos uploaded to the cloud.
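One commonly used motion-economy feature that video-based skill assessment systems extract from tracked tool trajectories is total path length (shorter, more direct paths generally indicate more economical motion). This is a generic sketch, not necessarily a feature used by this particular system:

```python
import math

def path_length(points):
    """Total distance travelled by a tracked instrument tip, computed
    from successive (x, y) positions detected in video frames."""
    return sum(math.dist(p, q) for p, q in zip(points, points[1:]))

# A jittery trajectory accumulates more path length than a direct one:
direct = [(0.0, 0.0), (3.0, 4.0)]
jittery = [(0.0, 0.0), (1.0, 3.0), (3.0, 4.0)]
```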
ContributorsIslam, Gazi (Author) / Li, Baoxin (Thesis advisor) / Liang, Jianming (Thesis advisor) / Dinu, Valentin (Committee member) / Greenes, Robert (Committee member) / Smith, Marshall (Committee member) / Kahol, Kanav (Committee member) / Patel, Vimla L. (Committee member) / Arizona State University (Publisher)
Created2013
Description
Signal processing techniques have been used extensively in many engineering problems, and in recent years their application has extended to non-traditional research fields such as biological systems. Many of these applications require extraction of a signal or parameter of interest from degraded measurements. One such application is the mass spectrometry immunoassay (MSIA), which has become one of the primary biomarker discovery techniques. MSIA analyzes protein molecules as potential biomarkers using time-of-flight mass spectrometry (TOF-MS). Peak detection in TOF-MS is important for biomarker analysis and many other MS-related applications. Though many peak detection algorithms exist, most of them are based on heuristic models. One way of detecting signal peaks is to deploy stochastic models of the signal and noise observations. The likelihood ratio test (LRT) detector, based on the Neyman-Pearson (NP) lemma, is a uniformly most powerful test for decision making in the form of a hypothesis test. The primary goal of this dissertation is to develop signal and noise models for electrospray ionization (ESI) TOF-MS data. A new method is proposed for developing the signal model by employing first-principles calculations based on device physics and molecular properties. The noise model is developed by analyzing MS data from careful experiments on the ESI mass spectrometer. A non-flat baseline in MS data is common, and the reasons behind its formation have not been fully understood. A new signal model explaining the presence of the baseline is proposed, though detailed experiments are needed to further substantiate the model assumptions. Signal detection schemes based on these signal and noise models are proposed. A maximum likelihood (ML) method is introduced for estimating the signal peak amplitudes. The performance of the detection methods and the ML estimation is evaluated with Monte Carlo simulation, which shows promising results.
An application of these methods is proposed for fractional abundance calculation in biomarker analysis, which is mathematically robust and fundamentally different from current algorithms. Biomarker panels for type 2 diabetes and cardiovascular disease are analyzed using existing MS analysis algorithms. Finally, a support vector machine based multi-classification algorithm is developed for evaluating the biomarkers' effectiveness in discriminating type 2 diabetes and cardiovascular disease and is shown to perform better than a linear discriminant analysis based classifier.
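For the textbook case of a known peak template in additive white Gaussian noise, the Neyman-Pearson-optimal LRT reduces to correlating the data with the template and comparing against a threshold chosen for the target false-alarm rate. A minimal sketch under that standard model (not the dissertation's full ESI TOF-MS signal and noise models):

```python
def lrt_peak_detect(y, s, threshold):
    """Likelihood-ratio test for a known peak template s in additive white
    Gaussian noise: the log-likelihood ratio is monotone in the correlation
    sum(s_i * y_i), so we decide 'peak present' (H1) when the correlation
    exceeds the threshold set by the desired false-alarm probability."""
    statistic = sum(si * yi for si, yi in zip(s, y))
    return statistic > threshold

TEMPLATE = [1.0, 2.0, 1.0]  # idealized peak shape (illustrative)
```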
ContributorsBuddi, Sai (Author) / Taylor, Thomas (Thesis advisor) / Cochran, Douglas (Thesis advisor) / Nelson, Randall (Committee member) / Duman, Tolga (Committee member) / Arizona State University (Publisher)
Created2012
Description
This dissertation investigates the condition of skeletal muscle insulin resistance using bioinformatics and computational biology approaches. Drawing from several studies and numerous data sources, I have attempted to uncover molecular mechanisms at multiple levels, from detailed atomistic simulations of a single protein to data-mining approaches applied at the systems biology level, and I provide new targets for the research community to explore. Furthermore, I present a new online web resource that unifies various bioinformatics databases to enable discovery of relevant features in 3D protein structures.
ContributorsMielke, Clinton (Author) / Mandarino, Lawrence (Committee member) / LaBaer, Joshua (Committee member) / Magee, D. Mitchell (Committee member) / Dinu, Valentin (Committee member) / Willis, Wayne (Committee member) / Arizona State University (Publisher)
Created2013
Description
The living world we inhabit and observe is extraordinarily complex. From the perspective of a person analyzing data about the living world, complexity is most commonly encountered in two forms: 1) in the sheer size of the datasets that must be analyzed and the physical number of mathematical computations necessary to obtain an answer, and 2) in the underlying structure of the data, which does not conform to classical normal-theory statistical assumptions and includes clustering and unobserved latent constructs. Until recently, the methods and tools necessary to effectively address the complexity of biomedical data were not ordinarily available. The utility of four methods designed to help make sense of complex biomedical data, High Performance Computing, Monte Carlo Simulations, Multi-Level Modeling, and Structural Equation Modeling, is presented here.
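Of the four methods, Monte Carlo simulation is the simplest to illustrate: an analytically hard (or merely inconvenient) quantity is replaced by the average of many random draws. The classic quarter-circle estimate of π, a toy example rather than one of the simulations in this work:

```python
import random

def monte_carlo_pi(n, seed=0):
    """Estimate pi by sampling points uniformly in the unit square and
    counting the fraction that falls inside the quarter circle."""
    rng = random.Random(seed)
    inside = sum(rng.random() ** 2 + rng.random() ** 2 <= 1.0
                 for _ in range(n))
    return 4.0 * inside / n
```

The estimate converges at the usual O(1/sqrt(n)) Monte Carlo rate, so tightening the answer by a decimal place costs roughly a hundredfold more samples.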
ContributorsBrown, Justin Reed (Author) / Dinu, Valentin (Thesis advisor) / Johnson, William (Committee member) / Petitti, Diana (Committee member) / Arizona State University (Publisher)
Created2012
Description
Complex dynamical systems consisting of interacting dynamical units are ubiquitous in nature and society. Predicting and reconstructing the nonlinear dynamics of the units, and the complex interaction networks among them, serves as the basis for understanding a variety of collective dynamical phenomena. I present a general method to address these two outstanding problems as a whole, based solely on time-series measurements. The method is implemented by incorporating a compressive sensing approach that enables accurate reconstruction of complex dynamical systems in terms of both the nodal equations that determine the self-dynamics of units and the detailed coupling patterns among units. The representative advantages of the approach are (i) the sparse data requirement, which allows for successful reconstruction from limited measurements, and (ii) general applicability to identical and nonidentical nodal dynamics, and to networks with arbitrary interaction structures, strengths, and sizes. Two other challenging problems of significant interest in nonlinear dynamics, (i) predicting catastrophes in nonlinear dynamical systems in advance of their occurrence and (ii) predicting the future state of time-varying nonlinear dynamical systems, can also be formulated and solved in the framework of compressive sensing using only limited measurements. Once the network structure has been inferred, the dynamical behavior on it can be investigated, for example to optimize information-spreading dynamics, suppress cascading dynamics and traffic congestion, enhance synchronization, study game dynamics, etc. The results can yield insights into the design of control strategies for real-world social and natural systems. Since 2004, there has been a tremendous amount of interest in graphene. The most striking feature of graphene is its linear energy-momentum relationship at low energies.
The quasi-particles inside the system can be treated as chiral, massless Dirac fermions obeying relativistic quantum mechanics. Graphene therefore provides a perfect test bed for investigating relativistic quantum phenomena, such as relativistic quantum chaotic scattering and anomalous electron paths induced by Klein tunneling. These phenomena have profound implications for the development of graphene-based devices that require stable electronic properties.
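The compressive sensing reconstruction described above hinges on recovering a sparse coefficient vector from far fewer measurements than unknowns. A minimal sketch using orthogonal matching pursuit, a common greedy stand-in for the l1-minimization solvers typically used in compressive sensing (illustrative, not the solver used in this dissertation):

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal matching pursuit: greedily recover a k-sparse x
    satisfying y = A @ x by repeatedly selecting the column of A most
    correlated with the residual and re-fitting by least squares."""
    residual, support = y.copy(), []
    coef = np.zeros(0)
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in support:
            support.append(j)
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x = np.zeros(A.shape[1])
    x[support] = coef
    return x
```

With a random Gaussian measurement matrix, a vector with only a few nonzero entries is recovered exactly from measurements numbering well below the ambient dimension, which is what makes reconstruction from limited time-series data feasible.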
ContributorsYang, Rui (Author) / Lai, Ying-Cheng (Thesis advisor) / Duman, Tolga M. (Committee member) / Akis, Richard (Committee member) / Huang, Liang (Committee member) / Arizona State University (Publisher)
Created2012
Description
Critical care environments are complex in nature. Fluctuating team dynamics and the plethora of technology and equipment create unforeseen demands on clinicians. Such environments become chaotic very quickly due to the chronic exposure to unpredictable clusters of events. In order to cope with this complexity, clinicians tend to develop ad-hoc adaptations to function in an effective manner. It is these adaptations, or "deviations" from expected behaviors, that provide insight into the processes that shape the overall behavior of the complex system. The research described in this manuscript examines the cognitive basis of clinicians' adaptive mechanisms and presents a methodology for studying them. Examining interactions in complex systems is difficult due to the dissociation between the nature of the environment and the tools available to analyze the underlying processes. In this work, the use of a mixed-methodology framework to study trauma critical care, a complex environment, is presented. The hybrid framework supplements existing methods of data collection (qualitative observations) with quantitative methods (use of electronic tags) to capture activities in the complex system. Quantitative models of activities (using Hidden Markov Modeling) and theoretical models of deviations were developed to support this mixed-methodology framework. The quantitative activity models were tested with a set of fifteen simulated activities that represent workflow in trauma care. A mean recognition rate of 87.5% was obtained in automatically recognizing activities. Theoretical models, on the other hand, were developed using field observations of 30 trauma cases. The analysis of the classification schema (with substantial inter-rater reliability) and the 161 identified deviations shows that the expertise and role of the clinician in the trauma team influence the nature of the deviations made (p<0.01).
The results show that while expert clinicians deviate to innovate, deviations by novices often result in errors. Experts' flexibility and adaptiveness allow their deviations to generate innovative ideas, in particular when dynamic adjustments are required in complex situations. The findings suggest that while adherence to protocols and standards is important for novice practitioners to reduce medical errors and ensure patient safety, there is also a strong need to train novices in coping with complex situations.
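The activity models above can be sketched with the standard HMM forward algorithm: each activity is given its own small model, and a new observation sequence is assigned to the activity whose model yields the highest likelihood. A toy two-state example with made-up parameters, not the models trained on the trauma-care data:

```python
def forward_likelihood(obs, pi, A, B):
    """Forward algorithm: P(observation sequence | HMM), where pi is the
    initial state distribution, A the state-transition matrix, and B the
    per-state emission probabilities over observation symbols."""
    n = len(pi)
    alpha = [pi[s] * B[s][obs[0]] for s in range(n)]
    for o in obs[1:]:
        alpha = [B[t][o] * sum(alpha[s] * A[s][t] for s in range(n))
                 for t in range(n)]
    return sum(alpha)

# Tiny hand-checkable model: two hidden states, two observation symbols.
PI = [1.0, 0.0]
TRANS = [[0.5, 0.5], [0.0, 1.0]]
EMIT = [[0.9, 0.1], [0.2, 0.8]]
```

For the sequence [0, 1] this gives 0.9 * (0.5 * 0.1 + 0.5 * 0.8) = 0.405, which can be checked by hand against the two possible state paths.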
ContributorsVankipuram, Mithra (Author) / Greenes, Robert A (Thesis advisor) / Patel, Vimla L. (Thesis advisor) / Petitti, Diana B. (Committee member) / Dinu, Valentin (Committee member) / Smith, Marshall L. (Committee member) / Arizona State University (Publisher)
Created2012
Description
This work involved the analysis of a public health system and the design, development, and deployment of an enterprise informatics architecture, along with sustainable community methods, to address problems with the current public health system. Specifically, assessment of the Nationally Notifiable Disease Surveillance System (NNDSS) was instrumental in shaping the design of the current implementation at the Southern Nevada Health District (SNHD). The result of the system deployment at SNHD was considered as a basis for projecting the practical application and benefits of an enterprise architecture. This approach has resulted in a sustainable platform that enhances the practice of public health by improving the quality and timeliness of data, the effectiveness of investigations, and reporting across the continuum.
ContributorsKriseman, Jeffrey Michael (Author) / Dinu, Valentin (Thesis advisor) / Greenes, Robert (Committee member) / Johnson, William (Committee member) / Arizona State University (Publisher)
Created2012