ABSTRACT

Results from previous studies indicated nursing students needed to further develop critical thinking (CT), especially with respect to employing it in their clinical reasoning. Thus, the study was conducted to support development of students' CT inference subskills that could be applied as they engaged in clinical reasoning during course simulations. Relevant studies in areas such as CT, clinical reasoning, the nursing process, and inference subskills informed the study, as did the power of simulation as an instructional technique and reflection on those simulations. Participants included junior nursing students in their second semester of nursing school. They completed a pre- and post-intervention Critical Thinking Survey, kept reflective journals during the course of the intervention, and participated in interviews at the conclusion of the study. The intervention provided students with instruction on the use of three inference subskills (Facione, 2015); they also wrote reflective journal entries about their use of these skills. Quantitative results indicated no changes in various CT measures. By comparison, qualitative analysis of individual interviews and reflective journals showed students applied inference subskills in a limited way, demonstrated restricted clinical reasoning, displayed emerging reflection skills, and established a foundation on which to build additional CT in their professional roles. Limitations of the study included the length of the intervention and the limited depth of the instruction with respect to teaching the inference subskills. Discussion focused on explaining the results. Implications for teaching included revising the instruction in inference subskills to be more robust by extending it over time, perhaps across courses.
Additionally, a 'flipped' instructional process was discussed in which students would learn the subskills by viewing video modules prior to class and then be guided to apply their learning in classroom health care simulations. Implications for research included closer examination of the development of CT in clinical reasoning to devise a developmental trajectory that might be useful for understanding this phenomenon and for developing teaching strategies to assist students in learning to use these skills as part of the clinical reasoning process.
Contributors: LuPone, Kathleen A (Author) / Buss, Ray R (Thesis advisor) / Mertler, Craig A. (Committee member) / Heying-Stanley, Betty (Committee member) / Arizona State University (Publisher)
Created: 2017
ABSTRACT

The meteoric rise of Deep Neural Networks (DNNs) has led to the development of various Machine Learning (ML) frameworks (e.g., TensorFlow, PyTorch). Every ML framework has a different way of handling DNN models, data types, operations, and the internal representations stored on disk or in memory. There have been initiatives such as the Open Neural Network Exchange (ONNX) to standardize machine learning representations for better interoperability between the popular ML frameworks. Model Serving Platforms (MSPs) (e.g., TensorFlow Serving, Clipper) are used for serving DNN models to applications and edge devices. These platforms have gained widespread use for their flexibility in serving DNN models created by various ML frameworks, and they offer additional capabilities such as caching, automatic ensembling, and scheduling. However, few of these platforms focus on optimizing the storage of DNN models, some of which may take up to ∼130 GB of storage space ("Turing-NLG: A 17-billion-parameter language model by Microsoft," 2020). MSPs instead leave it to the ML frameworks to optimize the DNN model with model compression techniques such as quantization and pruning. This thesis investigates the viability of automatic cross-model compression using traditional deduplication techniques and storage optimizations. Scenarios are identified in which different DNN models have shareable weight parameters, and "chunking" a model into smaller pieces is explored as an approach to deduplication. The thesis also proposes a design for storage in a Relational Database Management System (RDBMS) that allows for automatic cross-model deduplication.
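The chunk-level deduplication idea described in the abstract can be illustrated with a minimal sketch: split each model's flat weight buffer into fixed-size chunks, content-address each chunk by its hash, and store identical chunks only once. This is a simplified illustration under assumed parameters (chunk size, SHA-256 hashing, an in-memory store), not the thesis's actual RDBMS-backed implementation.

```python
import hashlib
import numpy as np

CHUNK_SIZE = 1024  # bytes per chunk; real systems tune this (assumed value)

def chunk_and_hash(weights: np.ndarray, chunk_size: int = CHUNK_SIZE):
    """Split a flat weight tensor into fixed-size byte chunks and hash each."""
    raw = weights.tobytes()
    chunks = [raw[i:i + chunk_size] for i in range(0, len(raw), chunk_size)]
    return [(hashlib.sha256(c).hexdigest(), c) for c in chunks]

class DedupStore:
    """Content-addressed chunk store: identical chunks are stored once."""
    def __init__(self):
        self.chunks = {}   # chunk hash -> raw bytes (stored once)
        self.models = {}   # model name -> ordered list of chunk hashes

    def put_model(self, name: str, weights: np.ndarray):
        hashes = []
        for h, c in chunk_and_hash(weights):
            self.chunks.setdefault(h, c)  # only stored if not seen before
            hashes.append(h)
        self.models[name] = hashes

    def stored_bytes(self) -> int:
        return sum(len(c) for c in self.chunks.values())

# Two hypothetical models sharing a pretrained backbone but with different heads
# (the kind of shareable-parameter scenario the abstract mentions).
rng = np.random.default_rng(0)
backbone = rng.standard_normal(4096).astype(np.float32)  # 16 KB -> 16 chunks
head_a = np.arange(256, dtype=np.float32)                # 1 KB  -> 1 chunk
head_b = np.arange(256, dtype=np.float32) * 2

store = DedupStore()
store.put_model("model_a", np.concatenate([backbone, head_a]))
store.put_model("model_b", np.concatenate([backbone, head_b]))

naive = 2 * (backbone.nbytes + head_a.nbytes)  # storing both models in full
print("naive:", naive, "deduplicated:", store.stored_bytes())
```

Because the shared backbone chunks hash identically across both models, they are stored once; only the two distinct head chunks add extra storage. Chunk alignment matters in practice: the same weights at a different byte offset would hash differently, which is one reason chunking strategy is a design question in its own right.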
Contributors: Das, Amitabh (Author) / Zou, Jia (Thesis advisor) / Zhao, Ming (Thesis advisor) / Yang, Yingzhen (Committee member) / Arizona State University (Publisher)
Created: 2021