Matching Items (63)
Description
Surgery as a profession requires significant training to improve both clinical decision making and psychomotor proficiency. In the medical knowledge domain, tools have been developed, validated, and accepted for evaluation of surgeons' competencies. However, assessment of psychomotor skills still relies on the Halstedian model of apprenticeship, wherein surgeons are observed during residency for judgment of their skills. Although the value of this method of skills assessment cannot be ignored, novel methodologies of objective skills assessment need to be designed, developed, and evaluated to augment the traditional approach. Several sensor-based systems have been developed to measure a user's skill quantitatively, but the use of sensors can interfere with skill execution and thus limit the potential for evaluating real-life surgery. A method that judges skills automatically under real-life conditions should nevertheless be the ultimate goal, since only with such capabilities would a system be widely adopted. This research proposes a novel video-based approach for observing surgeons' hand and surgical tool movements in minimally invasive surgical training exercises as well as during laparoscopic surgery. Because our system does not require surgeons to wear special sensors, it has the distinct advantage over alternatives of offering skills assessment in both learning and real-life environments. The system automatically detects major skill-measuring features from surgical task videos using a series of computer vision algorithms and provides on-screen real-time performance feedback for more efficient skill learning. Finally, a machine-learning approach is used to develop an observer-independent composite scoring model through objective and quantitative measurement of surgical skills.
To increase the effectiveness and usability of the developed system, it is integrated with a cloud-based tool that automatically assesses surgical videos uploaded to the cloud.
Contributors: Islam, Gazi (Author) / Li, Baoxin (Thesis advisor) / Liang, Jianming (Thesis advisor) / Dinu, Valentin (Committee member) / Greenes, Robert (Committee member) / Smith, Marshall (Committee member) / Kahol, Kanav (Committee member) / Patel, Vimla L. (Committee member) / Arizona State University (Publisher)
Created: 2013
Description
In this thesis, we present the study of several physical properties of relativistic matter under extreme conditions. We start by deriving the rate of the nonleptonic weak processes and the bulk viscosity in several spin-one color superconducting phases of quark matter. We also calculate the bulk viscosity in the nonlinear and anharmonic regime in the normal phase of strange quark matter. We point out several qualitative effects due to the anharmonicity, although quantitatively they appear to be relatively small. In the corresponding study, we take into account the interplay between the nonleptonic and semileptonic weak processes. The results can be important for relating accessible observables of compact stars to their internal composition. We also use quantum field theoretical methods to study the transport properties of monolayer graphene in a strong magnetic field. The corresponding quasi-relativistic system reveals an anomalous quantum Hall effect, whose features are directly connected with the spontaneous flavor symmetry breaking. We study the microscopic origin of Faraday rotation and magneto-optical transmission in graphene and show that their main features are in agreement with the experimental data.
Contributors: Wang, Xinyang, Ph.D (Author) / Shovkovy, Igor (Thesis advisor) / Belitsky, Andrei (Committee member) / Easson, Damien (Committee member) / Peng, Xihong (Committee member) / Vachaspati, Tanmay (Committee member) / Arizona State University (Publisher)
Created: 2013
Description
This dissertation investigates the condition of skeletal muscle insulin resistance using bioinformatics and computational biology approaches. Drawing from several studies and numerous data sources, I have attempted to uncover molecular mechanisms at multiple levels. From detailed atomistic simulations of a single protein to data-mining approaches applied at the systems biology level, I provide new targets for the research community to explore. Furthermore, I present a new online web resource that unifies various bioinformatics databases to enable discovery of relevant features in 3D protein structures.
Contributors: Mielke, Clinton (Author) / Mandarino, Lawrence (Committee member) / LaBaer, Joshua (Committee member) / Magee, D. Mitchell (Committee member) / Dinu, Valentin (Committee member) / Willis, Wayne (Committee member) / Arizona State University (Publisher)
Created: 2013
Description
Numerical simulations are very helpful in understanding the physics of the formation of structure and galaxies. However, it is sometimes difficult to interpret model data with respect to observations, partly due to the difficulties and background noise inherent in observation. The goal here is to bridge this gap between simulation and observation by rendering the model output in image format, which is then processed by tools commonly used in observational astronomy. Images are synthesized in various filters by folding the output of cosmological simulations of gas dynamics with star formation and dark matter with the Bruzual-Charlot stellar population synthesis models. A variation of the Virgo-Gadget numerical simulation code is used with the hybrid gas and stellar formation models of Springel and Hernquist (2003). Outputs taken at various redshifts are stacked to create a synthetic view of the simulated star clusters. Source Extractor (SExtractor) is used to find groupings of stellar populations, which are considered as galaxies or galaxy building blocks, and photometry is used to estimate the rest-frame luminosities and distribution functions. With further refinements, this is expected to provide support for missions such as JWST, as well as to probe what additional physics are needed to model the data. The results show good agreement in many respects with observed properties of the galaxy luminosity function (LF) over a wide range of high redshifts. In particular, the slope (alpha), when fitted to the standard Schechter function, shows excellent agreement both in value and in evolution with redshift when compared with observation. Discrepancies with observation in other properties are seen to result from limitations of the simulation and from additional feedback mechanisms that still need to be included.
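For reference, the standard Schechter function mentioned above, written here in its usual luminosity form (a textbook expression, not transcribed from the dissertation), is

```latex
\Phi(L)\,dL \;=\; \phi^{*}\left(\frac{L}{L^{*}}\right)^{\alpha}
\exp\!\left(-\frac{L}{L^{*}}\right)\frac{dL}{L^{*}},
```

where \(\phi^{*}\) is a normalization, \(L^{*}\) a characteristic luminosity marking the exponential cutoff, and \(\alpha\) the faint-end slope whose value and redshift evolution the abstract compares with observation.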
Contributors: Morgan, Robert (Author) / Windhorst, Rogier A (Thesis advisor) / Scannapieco, Evan (Committee member) / Rhoads, James (Committee member) / Gardner, Carl (Committee member) / Belitsky, Andrei (Committee member) / Arizona State University (Publisher)
Created: 2012
Description
The living world we inhabit and observe is extraordinarily complex. From the perspective of a person analyzing data about the living world, complexity is most commonly encountered in two forms: 1) in the sheer size of the datasets that must be analyzed and the number of mathematical computations necessary to obtain an answer, and 2) in the underlying structure of the data, which does not conform to classical normal-theory statistical assumptions and includes clustering and unobserved latent constructs. Until recently, the methods and tools necessary to effectively address the complexity of biomedical data were not ordinarily available. The utility of four methods--High Performance Computing, Monte Carlo Simulations, Multi-Level Modeling and Structural Equation Modeling--designed to help make sense of complex biomedical data is presented here.
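The dissertation's own analyses are not reproduced in this listing; as a minimal, self-contained illustration of the Monte Carlo idea named above (estimating a quantity by repeated random sampling), the sketch below estimates pi from random points in the unit square. The function name and sample count are illustrative only.

```python
import random

def monte_carlo_pi(n_samples, seed=0):
    """Estimate pi by sampling uniform points in the unit square and
    counting the fraction that land inside the quarter circle of
    radius 1; that fraction converges to pi/4."""
    rng = random.Random(seed)  # fixed seed for reproducibility
    inside = 0
    for _ in range(n_samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            inside += 1
    return 4.0 * inside / n_samples

estimate = monte_carlo_pi(100_000)
```

The statistical error shrinks like 1/sqrt(n_samples), which is why such methods pair naturally with the high-performance computing the abstract also discusses.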
Contributors: Brown, Justin Reed (Author) / Dinu, Valentin (Thesis advisor) / Johnson, William (Committee member) / Petitti, Diana (Committee member) / Arizona State University (Publisher)
Created: 2012
Description
Understanding the temperature structure of protoplanetary disks (PPDs) is paramount to modeling disk evolution and future planet formation. PPDs around T Tauri stars have two primary heating sources: protostellar irradiation, which depends on the flaring of the disk, and accretional heating, as viscous coupling between annuli dissipates energy. I have written a "1.5-D" radiative transfer code to calculate disk temperatures assuming hydrostatic and radiative equilibrium. The model solves for the temperature at all locations simultaneously using Rybicki's method, converges rapidly at high optical depth, and retains full frequency dependence. The likely cause of accretional heating in PPDs is the magnetorotational instability (MRI), which acts where gas ionization is sufficiently high for the gas to couple to the magnetic field. This occurs in the surface layers of the disk, leaving the interior portions inactive (the "dead zone"). I calculate temperatures in PPDs undergoing such "layered accretion"; the method is used to study, for the first time, disks evolving in this way. Since the accretional heating is concentrated far from the midplane, temperatures in the disk's interior are lower than in PPDs modeled with vertically uniform accretion. I find that temperatures in layered accretion disks do not differ significantly from those of "passive disks," where no accretional heating exists. Emergent spectra are insensitive to the active layer thickness, making it difficult to observationally distinguish disks undergoing layered vs. uniform accretion. I also calculate the ionization chemistry in PPDs, using an ionization network that includes multiple charge states of dust grains. Combined with a criterion for the onset of the MRI, I calculate where the MRI can be initiated and the extent of dead zones in PPDs.
After accounting for feedback between temperature and active layer thickness, I find the surface density of the actively accreting layers falls rapidly with distance from the protostar, leading to a net outward flow of mass from ~0.1 to 3 AU. The clearing out of the innermost zones is possibly consistent with the observed behavior of recently discovered "transition disks."
Contributors: Lesniak, Michael V., III (Author) / Desch, Steven J. (Thesis advisor) / Scannapieco, Evan (Committee member) / Timmes, Francis (Committee member) / Starrfield, Sumner (Committee member) / Belitsky, Andrei (Committee member) / Arizona State University (Publisher)
Created: 2012
Description
This thesis deals with the first measurements done with a cold neutron beam at the Spallation Neutron Source at Oak Ridge National Laboratory. The experimental technique consisted of capturing polarized cold neutrons on nuclei to measure parity violation in the angular distribution of the gamma rays following neutron capture. The measurements presented here, for the nuclei chlorine (³⁵Cl) and aluminum (²⁷Al), are part of a program whose ultimate goal is to measure the asymmetry in the angular distribution of gamma rays emitted in the capture of neutrons on protons, with a precision better than 10⁻⁸, in order to extract the weak hadronic coupling constant due to the pion exchange interaction with isospin change equal to one (h¹π). Based on theoretical calculations, the asymmetry in the angular distribution of the gamma rays from neutron capture on protons has an estimated size of 5·10⁻⁸. This implies that the Al parity-violation asymmetry and its uncertainty have to be known with a precision better than 4·10⁻⁸. The proton target is liquid hydrogen (H₂) contained in an aluminum vessel. Results are presented for parity-violating and parity-conserving asymmetries in chlorine and aluminum. The systematic and statistical uncertainties in the calculation of the parity-violating and parity-conserving asymmetries are discussed.
Contributors: Balascuta, Septimiu (Author) / Alarcon, Ricardo (Thesis advisor) / Belitsky, Andrei (Committee member) / Doak, Bruce (Committee member) / Comfort, Joseph (Committee member) / Schmidt, Kevin (Committee member) / Arizona State University (Publisher)
Created: 2012
Description
Critical care environments are complex in nature. Fluctuating team dynamics and the plethora of technology and equipment create unforeseen demands on clinicians. Such environments become chaotic very quickly due to the chronic exposure to unpredictable clusters of events. In order to cope with this complexity, clinicians tend to develop ad-hoc adaptations to function in an effective manner. It is these adaptations or "deviations" from expected behaviors that provide insight into the processes that shape the overall behavior of the complex system. The research described in this manuscript examines the cognitive basis of clinicians' adaptive mechanisms and presents a methodology for studying the same. Examining interactions in complex systems is difficult due to the disassociation between the nature of the environment and the tools available to analyze underlying processes. In this work, the use of a mixed-methodology framework to study trauma critical care, a complex environment, is presented. The hybrid framework supplements existing methods of data collection (qualitative observations) with quantitative methods (use of electronic tags) to capture activities in the complex system. Quantitative models of activities (using Hidden Markov Modeling) and theoretical models of deviations were developed to support this mixed-methodology framework. The quantitative activity models were tested with a set of fifteen simulated activities that represent workflow in trauma care. A mean recognition rate of 87.5% was obtained in automatically recognizing activities. Theoretical models, on the other hand, were developed using field observations of 30 trauma cases. The analysis of the classification schema (with substantial inter-rater reliability) and of the 161 deviations identified shows that the expertise and role of the clinician in the trauma team influence the nature of the deviations made (p<0.01).
The results show that while expert clinicians deviate to innovate, deviations by novices often result in errors. Experts' flexibility and adaptiveness allow their deviations to generate innovative ideas, particularly when dynamic adjustments are required in complex situations. The findings suggest that while adherence to protocols and standards is important for novice practitioners in reducing medical errors and ensuring patient safety, there is also a strong need to train novices in coping with complex situations.
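The dissertation's trained HMM parameters are not published in this abstract; as a minimal sketch of how Hidden Markov Modeling recognizes activities from observations, the Viterbi decoder below recovers the most likely hidden activity sequence for a toy two-activity model. The state names, observation symbols, and probabilities are entirely hypothetical.

```python
def viterbi(obs, states, start_p, trans_p, emit_p):
    """Return the most likely hidden-state path for an observation
    sequence, via dynamic programming over joint probabilities."""
    # V[t][s] = (best probability of reaching state s at time t,
    #            predecessor state on that best path)
    V = [{s: (start_p[s] * emit_p[s][obs[0]], None) for s in states}]
    for t in range(1, len(obs)):
        V.append({})
        for s in states:
            prob, prev = max(
                (V[t - 1][p][0] * trans_p[p][s] * emit_p[s][obs[t]], p)
                for p in states)
            V[t][s] = (prob, prev)
    # Backtrack from the most probable final state.
    state = max(V[-1], key=lambda s: V[-1][s][0])
    path = [state]
    for t in range(len(obs) - 1, 0, -1):
        state = V[t][state][1]
        path.append(state)
    return path[::-1]

# Hypothetical trauma-bay model: two activities, two tag observations.
states = ("prep", "suture")
start_p = {"prep": 0.7, "suture": 0.3}
trans_p = {"prep": {"prep": 0.6, "suture": 0.4},
           "suture": {"prep": 0.3, "suture": 0.7}}
emit_p = {"prep": {"tray": 0.6, "needle": 0.4},
          "suture": {"tray": 0.2, "needle": 0.8}}

path = viterbi(("tray", "needle", "needle"), states,
               start_p, trans_p, emit_p)
```

In an activity-recognition setting like the one described, the observations would come from the electronic tags and the decoded states would be the clinical activities.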
Contributors: Vankipuram, Mithra (Author) / Greenes, Robert A (Thesis advisor) / Patel, Vimla L. (Thesis advisor) / Petitti, Diana B. (Committee member) / Dinu, Valentin (Committee member) / Smith, Marshall L. (Committee member) / Arizona State University (Publisher)
Created: 2012
Description
This work involved the analysis of a public health system and the design, development, and deployment of an enterprise informatics architecture and sustainable community methods to address problems with the current public health system. Specifically, assessment of the Nationally Notifiable Disease Surveillance System (NNDSS) was instrumental in shaping the design of the current implementation at the Southern Nevada Health District (SNHD). The result of the system deployment at SNHD was considered as a basis for projecting the practical application and benefits of an enterprise architecture. This approach has resulted in a sustainable platform that enhances the practice of public health by improving the quality and timeliness of data, the effectiveness of investigations, and reporting across the continuum.
Contributors: Kriseman, Jeffrey Michael (Author) / Dinu, Valentin (Thesis advisor) / Greenes, Robert (Committee member) / Johnson, William (Committee member) / Arizona State University (Publisher)
Created: 2012
Description
Immunosignaturing is a technology that allows the humoral immune response to be observed through the binding of antibodies to random-sequence peptides. The immunosignaturing microarray is based on complex mixtures of antibodies binding to arrays of random-sequence peptides in a multiplexed fashion. There are computational and statistical challenges to the analysis of immunosignaturing data. The overall aim of my dissertation is to develop novel computational and statistical methods for immunosignaturing data to assess its potential for diagnostics and drug discovery. First, I discovered that the Naive Bayes classification algorithm, which leverages the biological independence of the probes on our array to gather more information, outperforms other classification algorithms in both speed and accuracy. Second, using this classifier, I tested the specificity and sensitivity of the immunosignaturing platform for its ability to resolve four different diseases (pancreatic cancer, pancreatitis, type 2 diabetes, and PanIN) that target the same organ (the pancreas). These diseases were separated with >90% specificity from controls and from each other. Third, I observed that the immunosignatures of type 2 diabetes and of cardiovascular complications are unique, consistent, and reproducible and can be separated from controls with 100% accuracy. When these two complications arise in the same person, however, the resultant immunosignature is quite different from that of individuals with only one disease. I developed a method to trace back from informative random peptides in disease signatures to the potential antigen(s), and built a deciphering system to trace random peptides in the type 1 diabetes immunosignature to known antigens. Immunosignaturing, unlike ELISA, has the ability to detect not only the presence but also the absence of a response during a disease.
I observed that not only higher but also lower peptide intensities can be mapped to antigens in type 1 diabetes. To study the potential of immunosignaturing for population diagnostics, I studied the effects of age, gender, and geographical location on immunosignaturing data. For its potential as a health-monitoring technology, I proposed a single metric, the coefficient of variation, which has shown potential to change significantly when a person enters a disease state.
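The dissertation's exact formulation is not given in this abstract; as a minimal sketch of the coefficient-of-variation idea (dispersion of peptide intensities relative to their mean), with purely illustrative intensity values:

```python
import statistics

def coefficient_of_variation(intensities):
    """CV = sample standard deviation / mean: a dimensionless spread
    measure. A rising CV across an individual's arrays could flag a
    shift away from their baseline immunosignature."""
    mean = statistics.mean(intensities)
    if mean == 0:
        raise ValueError("mean intensity is zero; CV undefined")
    return statistics.stdev(intensities) / mean

# Illustrative intensity vectors, not real array data.
baseline_cv = coefficient_of_variation([100, 102, 98, 101, 99])
disease_cv = coefficient_of_variation([100, 150, 60, 140, 50])
```

Here both vectors have the same mean, but the second, more dispersed one yields a much larger CV, which is the kind of change a monitoring metric would track.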
Contributors: Kukreja, Muskan (Author) / Johnston, Stephen Albert (Thesis advisor) / Stafford, Phillip (Committee member) / Dinu, Valentin (Committee member) / Arizona State University (Publisher)
Created: 2012