Description

In a healthcare setting, the Sterile Processing Department (SPD) provides ancillary services to the Operating Room (OR), Emergency Room, Labor & Delivery, and off-site clinics. SPD's function is to reprocess reusable surgical instruments and return them to their home departments. The management of surgical instruments and medical devices can impact patient safety and hospital revenue. Any time instrumentation or devices are not available or are not fit for use, patient safety and revenue can be negatively impacted. One step of the instrument reprocessing cycle is sterilization. Steam sterilization is the method used for the majority of surgical instruments and is preferred to immediate use steam sterilization (IUSS) because terminally sterilized items can be stored until needed. IUSS items must be used promptly and cannot be stored for later use. IUSS is intended for emergency situations, not as a regular course of action. Unfortunately, IUSS is often used to compensate for inadequate inventory levels, scheduling conflicts, and miscommunications. If IUSS is viewed as an adverse event, then monitoring IUSS incidences can help healthcare organizations meet patient safety and financial goals while aiding process improvement efforts. This work recommends applying statistical process control methods to IUSS incidents and illustrates the use of control charts for IUSS occurrences through a case study and analysis of control charts for data from a healthcare provider. Furthermore, this work considers the application of data mining methods to IUSS occurrences and presents a representative example of data mining applied to IUSS data. This extends the application of statistical process control and data mining in healthcare.
ContributorsWeart, Gail (Author) / Runger, George C. (Thesis advisor) / Li, Jing (Committee member) / Shunk, Dan (Committee member) / Arizona State University (Publisher)
Created2014
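The control-chart approach described in the abstract above can be sketched with a simple c-chart on counts of IUSS events per period. A minimal illustration, assuming monthly event counts; the data and the resulting limits are made up for illustration, not taken from the case study:

```python
# c-chart for counts of IUSS events per month (illustrative data,
# not from the case study in the thesis).
import math

iuss_counts = [4, 6, 3, 7, 5, 4, 8, 6, 5, 14, 4, 5]  # events per month

c_bar = sum(iuss_counts) / len(iuss_counts)           # center line
ucl = c_bar + 3 * math.sqrt(c_bar)                    # upper control limit
lcl = max(0.0, c_bar - 3 * math.sqrt(c_bar))          # lower limit floored at 0

# Flag months whose counts fall outside the control limits.
out_of_control = [i for i, c in enumerate(iuss_counts) if c > ucl or c < lcl]
```

With these numbers the month with 14 events exceeds the upper limit and would be investigated as a special cause, which mirrors the abstract's framing of IUSS as a monitorable adverse event.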
Description

The ever-changing economic landscape has forced many companies to re-examine their supply chains. Global resourcing and outsourcing of processes has been a strategy many organizations have adopted to reduce cost and to increase their global footprint. This has, however, resulted in increased process complexity and reduced customer satisfaction. In order to meet and exceed customer expectations, many companies are forced to improve quality and on-time delivery, and have looked towards Lean Six Sigma as an approach to enable process improvement. The Lean Six Sigma literature is rich in deployment strategies; however, there is a general lack of a mathematical approach to deploy Lean Six Sigma in a global enterprise, including both project identification and prioritization. The research presented here is two-fold. Firstly, a process characterization framework is presented to evaluate processes based on eight characteristics. An unsupervised learning technique, using clustering algorithms, is then utilized to group processes that are Lean Six Sigma conducive. The approach helps Lean Six Sigma deployment champions identify key areas within the business on which to focus a Lean Six Sigma deployment. A case study is presented in which 33% of the processes were found to be Lean Six Sigma conducive. Secondly, having identified the parts of the business that are Lean Six Sigma conducive, the next steps are to formulate and prioritize a portfolio of projects. Very often the deployment champion is faced with the decision of selecting a portfolio of Lean Six Sigma projects that meet multiple objectives, which could include maximizing productivity, customer satisfaction, or return on investment, while meeting certain budgetary constraints. A multi-period 0-1 knapsack problem is presented that maximizes the expected net savings of the Lean Six Sigma portfolio over the life cycle of the deployment.
Finally, a case study is presented that demonstrates the application of the model in a large multinational company. Traditionally, Lean Six Sigma found its roots in manufacturing. The research presented in this dissertation also emphasizes the applicability of the methodology to the non-manufacturing space. Additionally, a comparison is conducted between manufacturing and non-manufacturing processes to highlight the challenges in deploying the methodology in both spaces.
ContributorsDuarte, Brett Marc (Author) / Fowler, John W (Thesis advisor) / Montgomery, Douglas C. (Thesis advisor) / Shunk, Dan (Committee member) / Borror, Connie (Committee member) / Konopka, John (Committee member) / Arizona State University (Publisher)
Created2011
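The project-selection step this abstract describes can be illustrated with a single-period 0-1 knapsack: choose the subset of candidate projects that maximizes expected net savings under a budget. A minimal sketch with hypothetical numbers; the dissertation's actual model is multi-period with additional constraints:

```python
# 0-1 knapsack: select Lean Six Sigma projects to maximize expected
# net savings under a single-period budget. All figures are
# hypothetical, not from the dissertation's case study.
savings = [120, 90, 200, 150]   # expected net savings per project ($k)
cost    = [ 50, 40,  90,  60]   # resource cost per project ($k)
budget  = 150

n = len(savings)
best_value, best_set = 0, []
for mask in range(1 << n):                       # enumerate every portfolio
    chosen = [i for i in range(n) if mask >> i & 1]
    if sum(cost[i] for i in chosen) <= budget:   # feasibility check
        value = sum(savings[i] for i in chosen)
        if value > best_value:
            best_value, best_set = value, chosen
```

Brute-force enumeration is fine for a handful of projects; a real deployment portfolio would use an integer-programming solver, as the multi-period formulation in the dissertation implies.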
Description

Buildings (approximately half commercial and half residential) consume over 70% of the electricity among all consumption units in the United States. Buildings are also responsible for approximately 40% of CO2 emissions, more than any other industry sector. As a result, the smart building initiative, which aims not only to manage electrical consumption efficiently but also to reduce the damaging effect of greenhouse gases on the environment, has been launched. Another important technology being promoted by government agencies is the smart grid, which manages energy usage across a wide range of buildings in an effort to reduce cost and increase reliability and transparency. While a great amount of effort has been devoted to these two initiatives, either by exploring smart grid designs or by developing technologies for smart buildings, research studying how smart buildings and the smart grid can coordinate to use energy more efficiently is currently lacking. In this dissertation, a "system-of-systems" approach is employed to develop an integrated building model consisting of a number of buildings (a building cluster) interacting with the smart grid. The buildings can function both as energy consumption units and as energy generation/storage units. Memetic Algorithm (MA) and Particle Swarm Optimization (PSO) based decision frameworks are developed for building operation decisions. In addition, the Particle Filter (PF) is explored as a means of fusing online sensor and meter data so that adaptive decisions can be made in response to a dynamic environment. The dissertation is divided into three inter-connected research components. First, an integrated building energy model including building consumption, storage, and generation sub-systems for the building cluster is developed. Then a bi-level Memetic Algorithm (MA) based decentralized decision framework is developed to identify the Pareto optimal operation strategies for the building cluster.
The Pareto solutions not only enable multi-dimensional tradeoff analysis, but also provide valuable insight for determining pricing mechanisms and power grid capacity. Secondly, a multi-objective PSO based decision framework is developed to reduce the computational effort of the MA based framework without sacrificing accuracy. With the improved performance, the decision time scale can be refined to support hourly operation decisions. Finally, by integrating the multi-objective PSO based decision framework with PF, an adaptive framework is developed for adaptive operation decisions for the smart building cluster. The adaptive framework not only enables the development of a high fidelity decision model but also enables the building cluster to respond to the dynamics and uncertainties inherent in the system.
ContributorsHu, Mengqi (Author) / Wu, Teresa (Thesis advisor) / Weir, Jeffery (Thesis advisor) / Wen, Jin (Committee member) / Fowler, John (Committee member) / Shunk, Dan (Committee member) / Arizona State University (Publisher)
Created2012
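The PSO machinery named in the abstract above can be sketched in its simplest form. The dissertation uses a multi-objective variant for building operation decisions; this single-objective toy, minimizing a made-up two-setpoint operating-cost function, only illustrates the velocity/position update rule at PSO's core:

```python
# Single-objective Particle Swarm Optimization sketch. The cost
# function and all parameters are illustrative stand-ins, not the
# dissertation's building model.
import random

random.seed(0)

def cost(x):
    # hypothetical operating cost, minimized at setpoints (2.0, -1.0)
    return (x[0] - 2.0) ** 2 + (x[1] + 1.0) ** 2

n_particles, n_iter, dim = 20, 100, 2
w, c1, c2 = 0.7, 1.5, 1.5          # inertia, cognitive, social weights

pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
vel = [[0.0] * dim for _ in range(n_particles)]
pbest = [p[:] for p in pos]        # each particle's best-seen position
gbest = min(pbest, key=cost)       # swarm's best-seen position

for _ in range(n_iter):
    for i in range(n_particles):
        for d in range(dim):
            r1, r2 = random.random(), random.random()
            vel[i][d] = (w * vel[i][d]
                         + c1 * r1 * (pbest[i][d] - pos[i][d])
                         + c2 * r2 * (gbest[d] - pos[i][d]))
            pos[i][d] += vel[i][d]
        if cost(pos[i]) < cost(pbest[i]):
            pbest[i] = pos[i][:]
    gbest = min(pbest, key=cost)
```

A multi-objective version replaces the single `gbest` with an archive of non-dominated solutions, which is how a Pareto front like the one the dissertation analyzes is accumulated.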
Description

This dissertation presents methods for the evaluation of ocular surface protection during natural blink function. The evaluation of ocular surface protection is especially important in the diagnosis of dry eye and the evaluation of dry eye severity in clinical trials. Dry eye is a highly prevalent disease affecting a large portion (between 11% and 22%) of an aging population. There is only one approved therapy, with limited efficacy, which results in a huge unmet need. The reason so few drugs have reached approval is the lack of a recognized therapeutic pathway with reproducible endpoints. While the interplay between blink function and ocular surface protection has long been recognized, all currently used evaluation techniques have addressed blink function in isolation from tear film stability, the gold standard of which is Tear Film Break-Up Time (TFBUT). In the first part of this research, a manual technique for calculating ocular surface protection during natural blink function through video analysis is developed and evaluated for its ability to differentiate between dry eye and normal subjects; the results are compared with those of TFBUT. In the second part, the technique is improved in precision and automated through the use of video analysis algorithms. This software, called the OPI 2.0 System, is evaluated for accuracy and precision, and comparisons are made between the OPI 2.0 System and other currently recognized dry eye diagnostic techniques (e.g., TFBUT). In the third part, the OPI 2.0 System is deployed in the evaluation of subjects before, immediately after, and 30 minutes after exposure to a controlled adverse environment (CAE); once again, the results are compared and contrasted against commonly used dry eye endpoints. The results demonstrate that the evaluation of ocular surface protection using the OPI 2.0 System offers superior accuracy to the current standard, TFBUT.
ContributorsAbelson, Richard (Author) / Montgomery, Douglas C. (Thesis advisor) / Borror, Connie (Committee member) / Shunk, Dan (Committee member) / Pan, Rong (Committee member) / Arizona State University (Publisher)
Created2012
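The underlying idea, comparing tear film stability against blink timing rather than in isolation, can be sketched as an Ocular Protection Index of the form OPI = TFBUT / IBI (interblink interval), where values below 1 suggest the tear film breaks up before the next blink. The formula follows the published OPI concept; the sample values are illustrative:

```python
# Ocular Protection Index sketch: OPI compares tear film break-up
# time (TFBUT) to the interblink interval (IBI). OPI >= 1 suggests
# the tear film outlasts the gap between blinks; OPI < 1 suggests
# ocular surface exposure. Sample values are illustrative only.
def opi(tfbut_s: float, ibi_s: float) -> float:
    """Ratio of tear film break-up time to interblink interval (seconds)."""
    return tfbut_s / ibi_s

def surface_protected(tfbut_s: float, ibi_s: float) -> bool:
    return opi(tfbut_s, ibi_s) >= 1.0

normal = surface_protected(10.0, 5.0)   # film outlasts the blink gap
dry_eye = surface_protected(3.0, 6.0)   # film breaks up before the next blink
```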
Description

The importance of efficient design and development teams in the 21st century is evident from the comprehensive literature review performed to digest various aspects of the benefits and foundations of teamwork. Although teamwork has a variety of applications in many different industries, the emerging field of biomedical engineering is growing significantly using principles of teamwork. Studying the attributes and mechanisms of creating successful biomedical engineering teams may contribute even more to the fast-paced growth of this industry. In the comprehensive literature review performed, the general importance of teamwork was studied, along with the specific hard and soft attributes that may contribute to teamwork. Currently, there are a number of general assessment tools that assist management in industry and academia in systematically bringing qualified people together so their talents and skills can flourish as members of biomedical engineering teams. These assessment tools, although useful, are not comprehensive: they do not incorporate the attributes identified in the literature, and they do not contain the perspective of students who have experience as members of a design and development team. Although many scientific studies and papers are devoted to this matter, there is no study that purposefully develops an assessment tool designated for the biomedical engineering workforce and constructed from the literature, current assessment tools, and the student perspective. It is hypothesized that a more comprehensive composite assessment tool that incorporates both soft and hard team attributes from a combined professional and student perspective could be implemented in the development of successful biomedical engineering design and development teams and subsequently used in the 21st-century workforce.
ContributorsAfzalian Naini, Nima (Author) / Pizziconi, Vincent (Thesis director) / Ankeny, Casey (Committee member) / Harrington Bioengineering Program (Contributor, Contributor) / Barrett, The Honors College (Contributor)
Created2017-05
Description

The aim of the research performed was to increase research potential in the field of cell stimulation by developing a method to adhere human neural progenitor cells (hNPCs) to a sterilized stretchable microelectrode array (SMEA). The two primary objectives of our research were to develop methods of sterilizing the polydimethylsiloxane (PDMS) substrate used for the SMEA, and to derive a functional procedure for adhering hNPCs to the PDMS. The proven method of sterilization was to plasma-treat the sample and then soak it in 70% ethanol for one hour. The most successful method for cell adhesion was plasma-treating the PDMS, followed by treating the surface of the PDMS with 0.01 mg/mL poly-l-lysine (PLL) and 3 µg/cm2 laminin. The development of these methods was an iterative process; as the methods were tested, any problems found with a method were corrected for the next round of testing until a final method was confirmed. Moving forward, these findings will allow cell behavior to be researched in a unique fashion: the response of adherent cells to physical stimulation can be better understood by measuring changes in their electrical activity.
ContributorsBridgers, Carson (Co-author) / Peterson, Mara (Co-author) / Stabenfeldt, Sarah (Thesis director) / Graudejus, Oliver (Committee member) / Harrington Bioengineering Program (Contributor) / School of Human Evolution and Social Change (Contributor) / Barrett, The Honors College (Contributor)
Created2016-05
Description

The purpose of this project was to examine the viability of protein biomarkers in pre-symptomatic detection of lung cancer. Regular screening has been shown to vastly improve patient survival outcomes. Lung cancer currently has the highest occurrence and mortality of all cancers, so a means of screening would be highly beneficial. In this research, the biomarker neuron-specific enolase (Enolase-2, eno2), a marker of small-cell lung cancer, was detected at varying concentrations using electrochemical impedance spectroscopy in order to develop a mathematical model for predicting protein expression based on a measured impedance value at a determined optimum frequency. The extent of protein expression would indicate the possibility of the patient having small-cell lung cancer. The optimum frequency was found to be 459 Hz, and the mathematical model to determine eno2 concentration based on impedance was found to be y = 40.246x + 719.5 with an R2 value of 0.82237. These results suggest that this approach could provide an option for the development of small-cell lung cancer screening utilizing electrochemical technology.
ContributorsEvans, William Ian (Author) / LaBelle, Jeffrey (Thesis director) / Spano, Mark (Committee member) / Barrett, The Honors College (Contributor) / Harrington Bioengineering Program (Contributor)
Created2014-05
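The reported calibration line can be inverted to estimate concentration from a measured impedance. A minimal sketch assuming y is the impedance magnitude at 459 Hz and x is the eno2 concentration; the abstract does not state the units, so the numbers here are treated as illustrative:

```python
# Invert the reported calibration line y = 40.246*x + 719.5 to
# estimate eno2 concentration x from a measured impedance y at 459 Hz.
# Units are not given in the abstract, so values are illustrative.
SLOPE = 40.246      # impedance change per unit concentration
INTERCEPT = 719.5   # impedance at zero concentration

def eno2_concentration(impedance: float) -> float:
    """Estimate eno2 concentration from an impedance reading."""
    return (impedance - INTERCEPT) / SLOPE

# A reading equal to the intercept corresponds to zero concentration;
# each additional SLOPE of impedance corresponds to one unit more.
zero_reading = eno2_concentration(INTERCEPT)
one_unit = eno2_concentration(INTERCEPT + SLOPE)
```

With an R2 of 0.82, any screening decision would need confidence intervals around such an estimate rather than the point value alone.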
Description

Determining the characteristics of an object during a grasping task requires a combination of mechanoreceptors in the muscles and fingertips. The width of a person's finger aperture during the grasp may affect the accuracy of how that person determines hardness, as well. These experiments aim to investigate how an individual perceives hardness amongst a gradient of varying hardness levels. The trend in the responses is assumed to follow a general psychometric function. This will provide information about subjects' abilities to differentiate between two largely different objects, and their tendencies towards guess-chances upon the presentation of two similar objects. After obtaining this data, it is then important to additionally test varying finger apertures in an object-grasping task. This will allow an insight into the effect of aperture on the obtained psychometric function, thus ultimately providing information about tactile and haptic feedback for further application in neuroprosthetic devices. Three separate experiments were performed in order to test the effect of finger aperture on object hardness differentiation. The first experiment tested a one-finger pressing motion among a hardness gradient of ballistic gelatin cubes. Subjects were asked to compare the hardness of one cube to another, which produced the S-curve that accurately portrayed the psychometric function. The second experiment utilized the Phantom haptic device in a similar setup, using the precision grip grasping motion, instead. This showed a more linear curve; the percentage reported harder increased as the hardness of the second presented cube increased, which was attributed to both the experimental setup limitations and the scale of the general hardness gradient. The third experiment then progressed to test the effect of three finger apertures in the same experimental setup. 
By providing three separate testing scenarios in the precision grip task, the experiment demonstrated that the level of finger aperture has no significant effect on an individual's ability to perceive hardness.
ContributorsMaestas, Gabrielle Elise (Author) / Helms Tillery, Stephen (Thesis director) / Tanner, Justin (Committee member) / Barrett, The Honors College (Contributor) / Harrington Bioengineering Program (Contributor)
Created2015-05
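The psychometric function assumed in the experiments above is typically modeled as a sigmoid: the probability of judging the second object harder rises from near 0, through 0.5 at the point of subjective equality, toward 1 as the hardness difference grows. A minimal logistic sketch with illustrative parameters, not values fitted to the experiment's data:

```python
# Logistic psychometric function for a hardness-comparison task.
# `pse` (point of subjective equality) and `slope` are illustrative
# parameters, not fitted to the thesis's data.
import math

def p_harder(diff: float, pse: float = 0.0, slope: float = 4.0) -> float:
    """Probability of reporting the second object as harder, given the
    hardness difference `diff` (second minus first). Equals 0.5 at the
    point of subjective equality, approaching guess-chance behavior
    when the two objects are similar."""
    return 1.0 / (1.0 + math.exp(-slope * (diff - pse)))

# Large differences are judged reliably; near-zero differences hover
# around chance, matching the guess-chance tendency described above.
reliable = p_harder(2.0)
chance = p_harder(0.0)
```

Fitting `pse` and `slope` per aperture condition is the natural way to test whether finger aperture shifts the curve, which is the comparison the third experiment makes.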
Description

With an increased demand for more enzyme-sensitive, bioresorbable, and more biodegradable polymers, various studies of copolymers have been developed. Polymers are widely used in many applications of biomedical engineering, such as tissue engineering, drug delivery, and wound healing. Depending on the conditions in which polymers are used, they are modified to accommodate a specific need. For instance, polymers used in drug delivery are more efficient if they are biodegradable; this ensures that the delivery system does not remain in the body after releasing the drug. It is therefore crucial that the polymer used in the drug system possess biodegradable properties. Such modification can be done in different ways, including the use of peptides to make copolymers that will degrade in the presence of enzymes. In this work, we studied the effect of the polypeptide GAPGLL on the polymer NIPAAm and compared the resulting Poly(NIPAAm-co-GAPGLL) with the previously studied Poly(NIPAAm-co-GAPGLF). Both copolymers were first synthesized from Poly(NIPAAm-co-NASI) through nucleophilic substitution by the two peptides. The synthesis of these copolymers was confirmed by 1H NMR spectra, and the corresponding LCST was determined through cloud point measurement. Both copolymers were degraded by the collagenase enzyme at 25 °C, and their 1H NMR spectra confirmed this process. Both copolymers were cleaved by collagenase, leading to an increase in solubility, which yielded a higher LCST compared to that before enzyme degradation. Future studies will focus on evaluating other peptides and on using other techniques, such as Differential Scanning Microcalorimetry (DSC), to better observe the LCST behavior. Moreover, enzyme kinetics studies are also crucial to evaluate how quickly the enzyme degrades each of the copolymers.
ContributorsUwiringiyimana, Mahoro Marie Chantal (Author) / Vernon, Brent (Thesis director) / Nikkhah, Mehdi (Committee member) / Harrington Bioengineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created2018-05
Description

The current Enterprise Requirements and Acquisition Model (ERAM), a discrete event simulation of the major tasks and decisions within the DoD acquisition system, identifies several what-if intervention strategies to improve program completion time. However, processes that contribute to the program acquisition completion time were not explicitly identified in the simulation study. This research seeks to determine the acquisition processes that contribute significantly to total simulated program time in the acquisition system for all programs reaching Milestone C. Specifically, this research examines the effect of increased scope management, technology maturity, and decreased variation and mean process times in post-Design Readiness Review contractor activities by performing additional simulation analyses. Potential policies are formulated from the results to further improve program acquisition completion time.
ContributorsWorger, Danielle Marie (Author) / Wu, Teresa (Thesis director) / Shunk, Dan (Committee member) / Wirthlin, J. Robert (Committee member) / Industrial, Systems (Contributor) / Barrett, The Honors College (Contributor)
Created2013-05
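The kind of what-if question this thesis asks, how reducing the variation of a post-Design-Readiness-Review activity changes program completion, can be sketched with a toy Monte Carlo over serial stage durations. All stage parameters below are hypothetical, not from ERAM:

```python
# Toy Monte Carlo of a serial acquisition timeline: total program time
# is the sum of independent stage durations. Cutting the variance of
# one late-stage activity lowers the fraction of programs that miss
# a deadline, even with the mean unchanged. Parameters are made up.
import random

random.seed(1)

def frac_late(stages, deadline=60.0, n=20000):
    """Fraction of simulated programs finishing after `deadline` months."""
    late = 0
    for _ in range(n):
        total = sum(random.gauss(mu, sd) for mu, sd in stages)
        if total > deadline:
            late += 1
    return late / n

baseline = [(12, 3), (18, 6), (24, 9)]   # (mean, sd) months per stage
improved = [(12, 3), (18, 6), (24, 3)]   # reduced variation in last stage

frac_base = frac_late(baseline)
frac_impr = frac_late(improved)
```

The full ERAM model is a discrete event simulation with queueing and decision logic rather than a simple sum, but the same comparison-of-scenarios logic drives its intervention analyses.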