Matching Items (83)

Description

Filtration for microfluidic sample-collection devices is desirable for sample selection, concentration, preprocessing, and downstream manipulation, but microfabricating the required sub-micrometer filtration structures is an elaborate process. This thesis presents a simple method to fabricate polydimethylsiloxane (PDMS) devices with an integrated membrane filter that samples, lyses, and extracts the DNA from microorganisms in aqueous environments. An off-the-shelf membrane filter disc was embedded in a PDMS layer and sequentially bonded with other PDMS channel layers. No leakage was observed during filtration. The device was validated by concentrating a large amount of the cyanobacterium Synechocystis from simulated sample water, with consistent performance across devices. After sufficient biomass had accumulated on the filter, a sequential electrochemical lysis step was performed by applying 5 V DC across the filter. The device was further evaluated by delivering several samples of differing concentrations of the cyanobacterium Synechocystis and then quantifying the DNA using real-time PCR. Lastly, an environmental sample was run through the device and the amount of photosynthetic microorganisms present in the water was determined. The major strengths of this design are low energy demand, inexpensive materials, simple design, straightforward fabrication, and robust performance, which together enable wide utility of similar chip-based devices for field-deployable operations in environmental microbiotechnology.
Contributors: Lecluse, Aurelie (Author) / Meldrum, Deirdre (Thesis advisor) / Chao, Joseph (Thesis advisor) / Westerhoff, Paul (Committee member) / Arizona State University (Publisher)
Created: 2011
Description

Single cell phenotypic heterogeneity studies reveal more information about the pathogenesis process than conventional bulk methods. Furthermore, investigation of the individual cellular response mechanism during rapid environmental changes can only be achieved at the single cell level. By enabling the study of cellular morphology, a single cell three-dimensional (3D) imaging system can be used to diagnose fatal diseases, such as cancer, at an early stage. One proven method, Cell-CT, accomplishes 3D imaging by rotating a single cell around a fixed axis. However, some existing cell-rotation mechanisms require intricate microfabrication, and others fail to provide a suitable environment for living cells. This thesis develops a microvortex chamber that allows living cells to be rotated by hydrodynamic flow alone while facilitating imaging access. In this thesis work, 1) the new chamber design was developed through numerical simulation. Simulations revealed that, in order to form a microvortex in the side chamber, the ratio of the chamber opening to the channel width must be smaller than one. After comparing different chamber designs, the trapezoidal side chamber was selected because it demonstrated controllable circulation and met the imaging requirements. Microvortex properties were not sensitive to chamber interface angles over the range tested (0.32 to 0.64), and a similar trend was observed when chamber heights were larger than the chamber opening. 2) Micro-particle image velocimetry was used to characterize the microvortices and validate the simulation results. Agreement between experiment and simulation confirmed that numerical simulation was an effective method for chamber design. 3) Finally, cell rotation experiments were performed in the trapezoidal side chamber. The experimental results demonstrated cell rotational rates ranging from 12 to 29 rpm for regular cells. With a volumetric flow rate of 0.5 µL/s, an irregular cell rotated at a mean rate of 97 ± 3 rpm. Rotational rates can be changed by altering inlet flow rates.
Contributors: Zhang, Wenjie (Author) / Frakes, David (Thesis advisor) / Meldrum, Deirdre (Thesis advisor) / Chao, Shih-hui (Committee member) / Wang, Xiao (Committee member) / Arizona State University (Publisher)
Created: 2011
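
The design rule reported in the abstract above (a microvortex forms only when the chamber-opening-to-channel-width ratio is below one) and the dependence of rotation rate on inlet flow rate can be illustrated with a minimal sketch. The linear flow-to-rotation scaling and its constant are assumptions for illustration only; the thesis obtains rotation rates from CFD and micro-PIV, not from a formula like this.

```python
# Illustrative sketch of the microvortex-chamber design rule described above.
# The geometry check mirrors the stated criterion (chamber opening / channel
# width < 1); the linear flow-to-rotation scaling is a placeholder assumption.

def microvortex_forms(chamber_opening_um: float, channel_width_um: float) -> bool:
    """Stated design rule: a recirculating microvortex forms in the side
    chamber only when the opening-to-channel-width ratio is below one."""
    return (chamber_opening_um / channel_width_um) < 1.0

def estimated_rotation_rpm(flow_rate_ul_per_s: float,
                           rpm_per_ul_per_s: float = 50.0) -> float:
    """Hypothetical linear scaling of cell rotation rate with inlet flow rate.
    The proportionality constant is invented, not a measured value."""
    return rpm_per_ul_per_s * flow_rate_ul_per_s

if __name__ == "__main__":
    print(microvortex_forms(chamber_opening_um=80, channel_width_um=100))   # True
    print(microvortex_forms(chamber_opening_um=120, channel_width_um=100))  # False
    print(f"{estimated_rotation_rpm(0.5):.0f} rpm at 0.5 uL/s (illustrative)")
```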
Description

Laboratory automation systems have seen many technological advances in recent years. As a result, the software written for them is becoming increasingly sophisticated. Existing software architectures and standards target a wider domain of software development and must be customized before they can be used to develop software for laboratory automation systems. This thesis proposes an architecture that is based on existing software architectural paradigms and is specifically tailored to developing software for a laboratory automation system. The architecture is based on fairly autonomous software components that can be distributed across multiple computers. The components communicate asynchronously by passing messages to one another. The architecture can be used to develop software that is distributed, responsive, and thread-safe. The thesis also proposes a framework, developed to implement the ideas of the architecture, which is used to build software that is scalable, distributed, responsive, and thread-safe. The framework currently includes components to control commonly used laboratory automation devices such as mechanical stages and cameras, and to perform common laboratory automation functions such as imaging.
Contributors: Kuppuswamy, Venkataramanan (Author) / Meldrum, Deirdre (Thesis advisor) / Collofello, James (Thesis advisor) / Sarjoughian, Hessam S. (Committee member) / Johnson, Roger (Committee member) / Arizona State University (Publisher)
Created: 2012
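
A minimal sketch of the architectural idea described in the abstract above: autonomous components, each running independently, that communicate asynchronously by passing messages to one another. The component names (StageComponent, CameraComponent) and message fields are hypothetical; this is not the thesis framework itself, only an illustration of the message-passing pattern.

```python
# Minimal message-passing component sketch: each component owns an inbox queue
# and processes messages on its own thread, so senders never block.

import threading
import queue

class Component(threading.Thread):
    """An autonomous component with its own inbox, processed on its own thread."""
    def __init__(self, name: str):
        super().__init__(daemon=True)
        self.name = name
        self.inbox: queue.Queue = queue.Queue()

    def send(self, message: dict) -> None:
        """Asynchronous, non-blocking message delivery."""
        self.inbox.put(message)

    def run(self) -> None:
        while True:
            message = self.inbox.get()
            if message.get("command") == "shutdown":
                break
            self.handle(message)

    def handle(self, message: dict) -> None:
        print(f"[{self.name}] handling {message}")

if __name__ == "__main__":
    stage = Component("StageComponent")    # e.g., a mechanical-stage controller
    camera = Component("CameraComponent")  # e.g., an imaging component
    stage.start(); camera.start()
    stage.send({"command": "move_to", "x_mm": 1.5, "y_mm": 0.2})
    camera.send({"command": "acquire", "exposure_ms": 100})
    stage.send({"command": "shutdown"}); camera.send({"command": "shutdown"})
    stage.join(); camera.join()
```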
151177-Thumbnail Image.png
Description

Single cell analysis has become increasingly important in understanding disease onset, progression, treatment, and prognosis, especially when applied to cancer, where cellular responses are highly heterogeneous. Through the advent of single cell computerized tomography (Cell-CT), researchers and clinicians now have the ability to obtain high-resolution three-dimensional (3D) reconstructions of single cells. Yet to date, no live-cell compatible version of the technology exists. In this thesis, a microfluidic chip with the ability to rotate live single cells in hydrodynamic microvortices about an axis parallel to the optical focal plane has been demonstrated. The chip utilizes a novel 3D microchamber design arranged beneath a main channel, creating flow detachment into the chamber and producing recirculating flow conditions. Single cells are flowed through the main channel, held in the center of the microvortex by an optical trap, and rotated by the forces induced by the recirculating fluid flow. Computational fluid dynamics (CFD) was employed to optimize the geometry of the microchamber. Two methods for the fabrication of the 3D microchamber were devised: anisotropic etching of silicon and backside diffuser photolithography (BDPL). First, the optimization of the silicon etching conditions was demonstrated through design of experiments (DOE). In addition, a non-conventional method of soft lithography was demonstrated, which incorporates the use of two positive molds, one of the main channel and the other of the microchambers, compressed together during replication to produce a single ultra-thin (<200 µm) negative used for device assembly. Second, methods for using thick negative photoresists such as SU-8 with BDPL were developed, including a new, simple, and effective method for promoting the adhesion of SU-8 to glass. An assembly method that bonds two individual ultra-thin (<100 µm) replications of the channel and the microfeatures has also been demonstrated. Finally, a pressure-driven pumping system with nanoliter-per-minute flow rate regulation, sub-second response times, and <3% flow variability has been designed and characterized. The fabrication and assembly of this device is inexpensive and utilizes simple variants of conventional microfluidic fabrication techniques, making it easily accessible to the single cell analysis community.
Contributors: Myers, Jakrey R (Author) / Meldrum, Deirdre (Thesis advisor) / Johnson, Roger (Committee member) / Frakes, David (Committee member) / Arizona State University (Publisher)
Created: 2012
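
The closed-loop flow regulation mentioned at the end of the abstract above can be sketched as a simple feedback loop. The first-order plant model and PI gains below are assumptions for illustration only; the thesis characterizes a physical pressure-driven pumping system and does not specify this control law.

```python
# Illustrative closed-loop regulation sketch for a pressure-driven pump.
# The plant model, time constant, and gains are invented placeholders.

def simulate_pi_flow_control(setpoint_nl_min: float,
                             steps: int = 200,
                             dt_s: float = 0.05,
                             kp: float = 0.8,
                             ki: float = 2.0) -> list[float]:
    """Drive a simple first-order flow response toward a nL/min setpoint."""
    flow = 0.0          # measured flow (nL/min)
    integral = 0.0
    history = []
    for _ in range(steps):
        error = setpoint_nl_min - flow
        integral += error * dt_s
        pressure = kp * error + ki * integral   # commanded pressure (arbitrary units)
        # Assumed first-order plant: flow relaxes toward a pressure-set value.
        flow += dt_s * (pressure - flow) / 0.2
        history.append(flow)
    return history

if __name__ == "__main__":
    trace = simulate_pi_flow_control(setpoint_nl_min=500.0)
    print(f"final flow: {trace[-1]:.1f} nL/min (target 500.0)")
```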
136202-Thumbnail Image.png
Description

The objective of this research is to determine an approach for automating the learning of the initial lexicon used in translating natural language sentences to their formal knowledge representations based on lambda-calculus expressions. Using a universal knowledge representation and its associated parser, this research attempts to use word alignment techniques to align natural language sentences to the linearized parses of their associated knowledge representations in order to learn the meanings of individual words. The work includes proposing and analyzing an approach that can be used to learn some of the initial lexicon.
Contributors: Baldwin, Amy Lynn (Author) / Baral, Chitta (Thesis director) / Vo, Nguyen (Committee member) / Industrial, Systems (Contributor) / Barrett, The Honors College (Contributor) / Computer Science and Engineering Program (Contributor)
Created: 2015-05
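
A toy sketch of the alignment idea described in the abstract above: each sentence is paired with the linearized parse of its knowledge representation, and co-occurrence counts propose candidate lexicon entries. The co-occurrence heuristic and the example data are simplifications invented for illustration, not the alignment algorithm used in the thesis.

```python
# Toy word-to-meaning alignment by co-occurrence counting over
# (sentence, linearized parse) pairs.

from collections import defaultdict

def propose_lexicon(pairs: list[tuple[str, str]]) -> dict[str, str]:
    """Map each word to the meaning token it co-occurs with most often."""
    counts: dict[str, dict[str, int]] = defaultdict(lambda: defaultdict(int))
    for sentence, linearized_parse in pairs:
        meaning_tokens = linearized_parse.replace("(", " ").replace(")", " ").split()
        for word in sentence.lower().split():
            for token in meaning_tokens:
                counts[word][token] += 1
    return {word: max(tokens, key=tokens.get) for word, tokens in counts.items()}

if __name__ == "__main__":
    training_pairs = [           # invented example data
        ("john walks", "walk(john)"),
        ("mary walks", "walk(mary)"),
        ("john runs",  "run(john)"),
        ("mary runs",  "run(mary)"),
    ]
    for word, meaning in sorted(propose_lexicon(training_pairs).items()):
        print(f"{word} -> {meaning}")
```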
135786-Thumbnail Image.png
Description

The deductive logic and leadership techniques presented in Dr. Dean Kashiwagi's Information Measurement Theory (IMT) and the Kashiwagi Solution Model (KSM) provide the tools to implement positive change within one's life and environment. By altering the way that I perceive the world, I have made progress in self-improvement through action. This project uses self-evaluation as a method to learn from dominant information and experience. Once it is established that natural laws govern the world, there is no randomness; events and decisions are all cause and effect. When seen through this lens, life becomes simpler and more manageable. Through my own implementation of IMT and KSM, I live a more productive lifestyle and feel that I have a meaningful plan for my future.
Contributors: Root, Shawn Michael (Author) / Kashiwagi, Dean (Thesis director) / Kashiwagi, Jacob (Committee member) / Industrial, Systems (Contributor) / Barrett, The Honors College (Contributor)
Created: 2016-05
135788-Thumbnail Image.png
Description

The Department of Defense (DoD) acquisition system is a complex system riddled with cost and schedule overruns. These overruns are serious issues, as the acquisition system is responsible for aiding U.S. warfighters; hence, a failing acquisition process is a potential threat to our nation's security. Furthermore, the DoD acquisition system is responsible for the proper allocation of billions of taxpayers' dollars and employs many civilian and military personnel. Much research has been done on the acquisition system in the past with little impact or success. One reason for this lack of success in improving the system is the lack of accurate models with which to test theories. This research is a continuation of the effort on the Enterprise Requirements and Acquisition Model (ERAM), a discrete-event simulation model of the DoD acquisition system. We propose to extend ERAM using agent-based simulation principles because of the many interactions among the subsystems of the acquisition system. We initially identify ten sub-models needed to simulate the acquisition system. This research focuses on three sub-models related to the budget of acquisition programs. In this thesis, we present the data collection, data analysis, initial implementation, and initial validation needed to facilitate these sub-models and lay the groundwork for a full agent-based simulation of the DoD acquisition system.
Contributors: Bucknell, Sophia Robin (Author) / Wu, Teresa (Thesis director) / Li, Jing (Committee member) / Colombi, John (Committee member) / Industrial, Systems (Contributor) / Barrett, The Honors College (Contributor)
Created: 2016-05
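
A toy sketch in the spirit of the budget-related sub-models described in the abstract above: acquisition-program agents request annual funding, a budget authority allocates a constrained pool, and underfunded programs accumulate schedule slip. All agent names, allocation rules, and numbers are invented for illustration; this is not ERAM or any of its actual sub-models.

```python
# Toy agent-based budget-cycle sketch with proportional allocation and an
# assumed rule that funding shortfall translates into schedule slip.

import random

class ProgramAgent:
    def __init__(self, name: str, annual_need: float):
        self.name = name
        self.annual_need = annual_need
        self.schedule_slip_years = 0.0

    def receive_funding(self, amount: float) -> None:
        shortfall = max(0.0, self.annual_need - amount)
        # Assumed rule: proportional shortfall becomes proportional slip.
        self.schedule_slip_years += shortfall / self.annual_need

def run_budget_cycle(programs: list[ProgramAgent], total_budget: float) -> None:
    requested = sum(p.annual_need for p in programs)
    for p in programs:
        share = total_budget * (p.annual_need / requested)
        p.receive_funding(share)

if __name__ == "__main__":
    random.seed(1)
    programs = [ProgramAgent("Program A", 120.0), ProgramAgent("Program B", 80.0)]
    for year in range(3):
        budget = 180.0 * random.uniform(0.85, 1.0)  # stochastic budget cuts
        run_budget_cycle(programs, budget)
    for p in programs:
        print(f"{p.name}: cumulative slip {p.schedule_slip_years:.2f} years")
```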
Description

In 2010, for the first time in human history, more than half of the world's total population lived in cities; this share is expected to increase to 60% or more by 2050. The goal of this research effort is to create a comprehensive model and modelling framework for megacities, middleweight cities, and urban agglomerations, collectively referred to as dense urban areas. The motivation for this project comes from the United States Army's desire for readiness in all operating environments, including dense urban areas. Though there is valuable insight in research to support Army operational behaviors, megacities are of unique interest to nearly every societal sector imaginable. A design of experiments is a novel application for determining both main effects and interaction effects among factors within a dense urban area, providing insight into factor causation. Regression modelling can also be employed for analysis of dense urban areas, providing wide-ranging insights into correlations between factors and their interactions. Past studies involving megacities concern themselves with the general trends of cities and their operation. This study is unique in its effort to model a single megacity to enable decision support for military operational planning, as well as potential decision support to city planners seeking to increase the sustainability of these dense urban areas and megacities.
Contributors: Mathesen, Logan Michael (Author) / Zenzen, Frances (Thesis director) / Jennings, Cheryl (Committee member) / Industrial, Systems (Contributor) / Barrett, The Honors College (Contributor)
Created: 2016-05
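
The design-of-experiments and regression approach described in the abstract above can be illustrated with a minimal two-factor factorial sketch: coded factor levels are fit by least squares to estimate main effects and the two-factor interaction. The factor names and response values are invented placeholders, not data from the study.

```python
# Minimal 2x2 factorial design fit by least squares to recover main effects
# and the two-factor interaction from coded (-1/+1) factor levels.

import numpy as np

# Hypothetical coded factors, e.g. population density and infrastructure investment.
density =    np.array([-1.0, +1.0, -1.0, +1.0])
investment = np.array([-1.0, -1.0, +1.0, +1.0])
response =   np.array([10.0, 14.0, 12.0, 20.0])  # invented outcome measure

# Regression model: y = b0 + b1*density + b2*investment + b3*(density*investment)
design_matrix = np.column_stack([
    np.ones_like(density), density, investment, density * investment,
])
coefficients, *_ = np.linalg.lstsq(design_matrix, response, rcond=None)
b0, b_density, b_investment, b_interaction = coefficients

# For coded -1/+1 levels, each effect is twice its regression coefficient.
print(f"main effect of density:    {2 * b_density:.2f}")
print(f"main effect of investment: {2 * b_investment:.2f}")
print(f"interaction effect:        {2 * b_interaction:.2f}")
```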
135972-Thumbnail Image.png
Description

The Performance Based Studies Research Group (PBSRG) at Arizona State University (ASU) has been studying the causes of increased cost and time in construction and other projects for the last 20 years. Through two longitudinal studies, one with a group of owners in the state of Minnesota (400 tests over six years) and one with the US Army Medical Command (400 tests over four years), the client/buyer has been identified as the largest risk and source of project cost and time deviations. This has been confirmed by over 1,500 tests conducted over the past 20 years. The focus of this research effort is to analyze the economic and performance impact of a construction delivery process called Job Order Contracting (JOC) and to evaluate the value, in terms of time, cost, and customer satisfaction, achieved when utilizing JOC rather than other traditional methods to complete projects. JOC's strength is that it minimizes the need for the owner to manage, direct, and control (MDC) through a lengthy traditional process of design, bid, and award of a construction contract. The study identifies the potential economic savings of utilizing JOC. This paper looks at the results of an ongoing study surveying eight different public universities. The results show that, in comparison with more traditional delivery models, JOC produces large cost savings and is preferred by most owners who have used multiple delivery systems.
Contributors: Li, Hao (Author) / Kashiwagi, Dean (Thesis director) / Kashiwagi, Jacob (Committee member) / Industrial, Systems (Contributor) / Department of Information Systems (Contributor) / Barrett, The Honors College (Contributor)
Created: 2015-12
136742-Thumbnail Image.png
Description

The Arizona Department of Environmental Quality (ADEQ) experienced a problem with the quality of its services. The agency was expending a large amount of resources, both time and money, to control contractors' work, yet was still receiving unexpectedly poor-quality work. ADEQ partnered with Dr. Dean Kashiwagi and the Performance Based Studies Research Group (PBSRG) early in 2014 to find a solution to the procurement problems. PBSRG introduced the Performance Information Procurement System (PIPS) and began implementation on four test projects. Three of the projects have moved into the execution phase, delivering almost $100K in savings in the procurement process alone. The three main causes of the issues were the lack of a system for identifying the quality of vendors; management, direction, and control (MDC); and the lack of a system to track vendor performance. Best value PIPS is a paradigm shift from the traditional price-based model and has succeeded in mitigating these challenges for the industry, while also validating the PBSRG model.
Contributors: Fink, Fabian Josef (Author) / Kashiwagi, Dean (Thesis director) / Kashiwagi, Jacob (Committee member) / Industrial, Systems (Contributor) / Barrett, The Honors College (Contributor)
Created: 2014-12