Description
The price-based marketplace has dominated the construction industry. The majority of owners use price-based practices of management (expectation and decision making, control, direction, and inspection). The price-based/management-and-control paradigm has not worked. Clients have now been moving toward the best-value environment (hire contractors who know what they are doing, who preplan, and who manage and minimize risk and deviation). Owners are trying to move from client direction and control to hiring an expert and allowing that expert to perform the quality control/risk management. This change of environment shifts the contractor's paradigm from a reactive to a proactive position, from a bureaucratic, non-accountable to an accountable position, from a relationship-based, non-measuring to a measuring entity, and toward a contractor who manages and minimizes the risk that it does not control. Years of price-based practices have caused poor quality and low performance in the construction industry. This research identifies what a best-value contractor or vendor is, what factors make up a best-value vendor, and the methodology to transform a vendor into a best-value vendor. It uses deductive logic and a case study to confirm the logic and the proposed methodology.
Contributors: Pauli, Michele (Author) / Kashiwagi, Dean (Thesis advisor) / Sullivan, Kenneth (Committee member) / Badger, William (Committee member) / Arizona State University (Publisher)
Created: 2011
Description
As global competition continues to grow more disruptive, organizational change is an ever-present reality that affects companies in all industries at both the operational and strategic levels. Organizational change capabilities have become a necessary aspect of existence for organizations in all industries worldwide. Research suggests that more than half of all organizational change efforts fail to achieve their original intended results, with some studies quoting failure rates as high as 70 percent. Exacerbating this problem is the fact that no single change methodology has been universally accepted. This thesis examines two aspects of organizational change, the implementation of tactical and strategic initiatives, focusing primarily on successful tactical implementation techniques. The research proposed that tactical issues typically dominate the focus of change agents and recipients alike, often to the detriment of strategic-level initiatives that are vital to the overall value and success of the organizational change effort. The Delphi method was employed to develop a tool to facilitate the initial implementation of organizational change such that tactical barriers were minimized and the resources available for strategic initiatives were maximized. Feedback from two expert groups of change agents and change facilitators was solicited to develop the tool and evaluate its impact. Preliminary pilot testing of the tool confirmed the proposal and successfully served to minimize tactical barriers to organizational change.
Contributors: Lines, Brian (Author) / Sullivan, Kenneth T. (Thesis advisor) / Badger, William (Committee member) / Kashiwagi, Dean (Committee member) / Arizona State University (Publisher)
Created: 2011
Description
Immunosignaturing is a new immunodiagnostic technology that uses random-sequence peptide microarrays to profile the humoral immune response. Though the peptides have little sequence homology to any known protein, binding of serum antibodies may be detected, and the pattern correlated to disease states. The aim of my dissertation is to analyze the factors affecting the binding patterns using monoclonal antibodies and to determine how much information may be extracted from the sequences. Specifically, I examined the effects of antibody concentration, competition, peptide density, and antibody valence. Peptide binding could be detected at the low concentrations relevant to immunosignaturing, and a monoclonal's signature could even be detected in the presence of a 100-fold excess of naive IgG. I also found that peptide density was important, but this effect was not due to bivalent binding. Next, I examined in more detail how a polyreactive antibody binds to the random-sequence peptides compared to protein-sequence-derived peptides, and found that it bound to many peptides from both sets, but with low apparent affinity. An in-depth look at the peptides' physicochemical properties and sequence complexity revealed that there were some correlations with properties, but they were generally small and varied greatly between antibodies. However, on a larger but less diverse peptide library, I found that sequence complexity was important for antibody binding. The redundancy in that library did enable the identification of specific sub-sequences recognized by an antibody. The current immunosignaturing platform has little repetition of sub-sequences, so I evaluated several methods to infer antibody epitopes. I found two methods that had modest prediction accuracy, and I developed a software application called GuiTope to facilitate the epitope prediction analysis. None of the methods had sufficient accuracy to identify an unknown antigen from a database.
In conclusion, the characteristics of the immunosignaturing platform observed through monoclonal antibody experiments demonstrate its promise as a new diagnostic technology. However, a major limitation is the difficulty in connecting the signature back to the original antigen, though larger peptide libraries could facilitate these predictions.
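The sub-sequence inference described above can be illustrated with a simple scoring scheme: rate every k-mer by the mean binding signal of the peptides that contain it, and treat high-scoring k-mers that recur across peptides as candidate epitope cores. This is a hypothetical sketch, not the GuiTope implementation; the peptide sequences and signal values below are invented for illustration.

```python
from collections import defaultdict

def kmer_signal_scores(peptides, signals, k=4, min_count=2):
    """Score each k-mer by the mean binding signal of the peptides that
    contain it; k-mers seen in fewer than min_count peptides are dropped."""
    totals = defaultdict(float)
    counts = defaultdict(int)
    for pep, sig in zip(peptides, signals):
        seen = set()
        for i in range(len(pep) - k + 1):
            kmer = pep[i:i + k]
            if kmer in seen:           # count each peptide once per k-mer
                continue
            seen.add(kmer)
            totals[kmer] += sig
            counts[kmer] += 1
    return {km: totals[km] / counts[km]
            for km in totals if counts[km] >= min_count}

# Hypothetical array data: the two strong binders share the motif "DFLE"
peptides = ["ADFLEKGH", "QWDFLEPR", "MNPQRSTV", "GHIKLMNA"]
signals = [9.0, 8.0, 1.0, 1.5]
scores = kmer_signal_scores(peptides, signals)
best = max(scores, key=scores.get)    # the candidate epitope core
```

With a redundant library, recurring k-mers stand out; on a low-redundancy platform most k-mers fail the min_count filter, which mirrors the difficulty noted above.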
Contributors: Halperin, Rebecca (Author) / Johnston, Stephen A. (Thesis advisor) / Bordner, Andrew (Committee member) / Taylor, Thomas (Committee member) / Stafford, Phillip (Committee member) / Arizona State University (Publisher)
Created: 2011
Description
In an effort to begin validating the large number of discovered candidate biomarkers, proteomics is beginning to shift from shotgun proteomic experiments towards targeted proteomic approaches that provide solutions to automation and economic concerns. Such approaches to validating biomarkers necessitate the mass spectrometric analysis of hundreds to thousands of human samples. As this takes place, a serendipitous opportunity has become evident. As one narrows the focus towards "single" protein targets (instead of entire proteomes) using pan-antibody-based enrichment techniques, a discovery science emerges, so to speak. This is due to the largely unknown context in which "single" proteins exist in blood (i.e., polymorphisms, transcript variants, and post-translational modifications); hence, targeted proteomics has applications even for established biomarkers. Furthermore, besides protein heterogeneity accounting for interferences with conventional immunometric platforms, it is becoming evident that this formerly hidden dimension of structural information also contains rich pathobiological information. Consequently, targeted proteomics studies that aim to ascertain a protein's genuine presentation within disease-stratified populations and serve as a stepping-stone within a biomarker translational pipeline are of clinical interest. Roughly 128 million Americans are pre-diabetic, diabetic, and/or have kidney disease, and public and private spending for treating these diseases is in the hundreds of billions of dollars. In an effort to create new solutions for the early detection and management of these conditions, described herein is the design, development, and translation of mass spectrometric immunoassays targeted towards diabetes and kidney disease. Population proteomics experiments were performed for the following clinically relevant proteins: insulin, C-peptide, RANTES, and parathyroid hormone. At least thirty-eight protein isoforms were detected.
Besides the numerous disease correlations confronted within the disease-stratified cohorts, certain isoforms also appeared to be causally related to the underlying pathophysiology and/or to have therapeutic implications. Technical advancements include multiplexed isoform quantification as well as a "dual-extraction" methodology for eliminating non-specific proteins while simultaneously validating isoforms. Industrial efforts towards widespread clinical adoption are also described. Consequently, this work lays a foundation for the translation of mass spectrometric immunoassays into the clinical arena and simultaneously presents the most recent advancements concerning the mass spectrometric immunoassay approach.
Contributors: Oran, Paul (Author) / Nelson, Randall (Thesis advisor) / Hayes, Mark (Thesis advisor) / Ros, Alexandra (Committee member) / Williams, Peter (Committee member) / Arizona State University (Publisher)
Created: 2011
Description
Facility managers have an important job in today's competitive business world: caring for the backbone of the corporation's capital. Maintaining assets and the efforts supporting them cause facility managers to fight an uphill battle to prove the worth of their organizations. This thesis discusses the important and flexible use of measurement and leadership reports and the benefits of justifying the work required to maintain or upgrade a facility. The task is streamlined by invoking accountability in subject-matter experts. The facility manager must trust in the ability of his or her workforce to get the job done. However, with accountability comes increased risk. Even though accountability may not remove total control or cease reactionary actions, facility managers can develop key leadership-based reports to reassign accountability and measure subject-matter experts while simultaneously reducing the reactionary actions that lead to increased cost. Identifying risks that are not controlled and reassigning them to subject-matter experts is imperative for effective facility management leadership; it allows facility managers to create an accurate and solid facility management plan, supports the organization's succession plan, and allows the organization to focus on key competencies.
Contributors: Tellefsen, Thor (Author) / Sullivan, Kenneth (Thesis advisor) / Kashiwagi, Dean (Committee member) / Badger, William (Committee member) / Arizona State University (Publisher)
Created: 2011
Description
Over the past couple of decades, quality has been an area of increased focus, and multiple models and approaches have been proposed to measure quality in the construction industry. This paper focuses on determining the quality of one type of roofing system used in the construction industry: sprayed polyurethane foam (SPF) roofs. Thirty-seven urethane-coated SPF roofs installed in 2005/2006 were visually inspected three times, at the 4-, 6-, and 7-year marks, to measure the percentage of blisters and repairs. A repair criterion was established at the 6-year mark based on the data, and vulnerable roofs were reported to contractors. Furthermore, the relation between roof quality and four possible time-of-installation factors, i.e., contractor, demographics, season, and difficulty (number of penetrations and size of the roof in square feet), was determined. Demographics and difficulty did not affect the quality of the roofs, whereas the contractor and the season in which the roof was installed did.
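The factor analysis summarized above is, in essence, a test of independence between an installation factor and the roof outcome. The sketch below shows one common way to run such a test with SciPy; the contingency counts are hypothetical, invented only to mirror the reported finding that the contractor affected quality.

```python
from scipy.stats import chi2_contingency

# Hypothetical counts of acceptable vs. vulnerable roofs per contractor
#          acceptable  vulnerable
table = [[15, 1],   # contractor A
         [8, 7],    # contractor B
         [5, 1]]    # contractor C

chi2, p, dof, expected = chi2_contingency(table)
significant = p < 0.05   # reject independence: contractor matters
```

The same test, run per factor (contractor, demographics, season, difficulty bucket), distinguishes factors that affect quality from those that do not.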
Contributors: Gajjar, Dhaval (Author) / Kashiwagi, Dean (Thesis advisor) / Sullivan, Kenneth (Committee member) / Badger, William (Committee member) / Arizona State University (Publisher)
Created: 2013
Description
Signal processing techniques have been used extensively in many engineering problems, and in recent years their application has extended to non-traditional research fields such as biological systems. Many of these applications require extraction of a signal or parameter of interest from degraded measurements. One such application is the mass spectrometric immunoassay (MSIA), which has been one of the primary biomarker discovery techniques. MSIA analyzes protein molecules as potential biomarkers using time-of-flight mass spectrometry (TOF-MS). Peak detection in TOF-MS is important for biomarker analysis and many other MS-related applications. Though many peak detection algorithms exist, most of them are based on heuristic models. One way of detecting signal peaks is by deploying stochastic models of the signal and noise observations. The likelihood ratio test (LRT) detector, based on the Neyman-Pearson (NP) lemma, is a uniformly most powerful test for decision making in the form of a hypothesis test. The primary goal of this dissertation is to develop signal and noise models for electrospray ionization (ESI) TOF-MS data. A new method is proposed for developing the signal model by employing first-principles calculations based on device physics and molecular properties. The noise model is developed by analyzing MS data from careful experiments in the ESI mass spectrometer. A non-flat baseline in MS data is common, and the reasons behind its formation have not been fully comprehended. A new signal model explaining the presence of the baseline is proposed, though detailed experiments are needed to further substantiate the model assumptions. Signal detection schemes based on these signal and noise models are proposed. A maximum likelihood (ML) method is introduced for estimating the signal peak amplitudes. The performance of the detection methods and of the ML estimation is evaluated with Monte Carlo simulations, which show promising results.
An application of these methods is proposed for fractional abundance calculation in biomarker analysis; it is mathematically robust and fundamentally different from the current algorithms. Biomarker panels for type 2 diabetes and cardiovascular disease are analyzed using existing MS analysis algorithms. Finally, a support vector machine based multi-classification algorithm is developed for evaluating the biomarkers' effectiveness in discriminating between type 2 diabetes and cardiovascular disease, and it is shown to perform better than a linear discriminant analysis based classifier.
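As a minimal illustration of the detection and estimation framework (not the dissertation's ESI TOF-MS models), consider the textbook special case of a known peak shape in white Gaussian noise: the NP-optimal LRT reduces to a matched filter compared against a threshold set by the desired false-alarm rate, and the ML amplitude estimate is a least-squares projection onto the peak shape.

```python
import numpy as np
from scipy.stats import norm

def ml_amplitude(y, s):
    """ML (least-squares) amplitude of known shape s in white Gaussian
    noise: a_hat = <y, s> / <s, s>."""
    return float(np.dot(y, s) / np.dot(s, s))

def lrt_detect(y, s, sigma, pfa=1e-3):
    """Matched-filter LRT. Under H0 (noise only) the statistic
    T = <y, s> / (sigma * ||s||) is standard normal, so the
    Neyman-Pearson threshold for false-alarm rate pfa is its quantile."""
    t = float(np.dot(y, s) / (sigma * np.linalg.norm(s)))
    return t > norm.ppf(1 - pfa), t

# Synthetic TOF-like spectrum: Gaussian peak, amplitude 5, unit noise
rng = np.random.default_rng(0)
x = np.arange(64)
shape = np.exp(-0.5 * ((x - 32) / 3.0) ** 2)
y = 5.0 * shape + rng.normal(0.0, 1.0, x.size)

detected, stat = lrt_detect(y, shape, sigma=1.0)
a_hat = ml_amplitude(y, shape)
```

The dissertation's setting is harder because the signal and noise models themselves must first be built from device physics and experiments, but the detect-then-estimate structure is the same.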
Contributors: Buddi, Sai (Author) / Taylor, Thomas (Thesis advisor) / Cochran, Douglas (Thesis advisor) / Nelson, Randall (Committee member) / Duman, Tolga (Committee member) / Arizona State University (Publisher)
Created: 2012
Description
This thesis examines the application of statistical signal processing approaches to data arising from surveys intended to measure psychological and sociological phenomena underpinning human social dynamics. The use of signal processing methods for the analysis of signals arising from the measurement of social, biological, and other non-traditional phenomena has been an important and growing area of signal processing research over the past decade. Here, we explore the application of statistical modeling and signal processing concepts to data obtained from the Global Group Relations Project, specifically to understand and quantify the effects and interactions of social psychological factors related to intergroup conflicts. We use Bayesian networks to specify prospective models of conditional dependence between social psychological factors and conflict variables: the networks are modeled by directed acyclic graphs, and the significant interactions are modeled as conditional probabilities. Since the data are sparse and multi-dimensional, we regress Gaussian mixture models (GMMs) against the data to estimate the conditional probabilities of interest. The parameters of the GMMs are estimated using the expectation-maximization (EM) algorithm. However, the EM algorithm may suffer from overfitting due to the high dimensionality and limited observations entailed in this data set. Therefore, the Akaike information criterion (AIC) and the Bayesian information criterion (BIC) are used for GMM order estimation. To assist intuitive understanding of the interactions of social variables and intergroup conflicts, we introduce a color-based visualization scheme in which the intensities of colors are proportional to the conditional probabilities observed.
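The GMM order-estimation step can be sketched with scikit-learn, which fits mixtures by EM and exposes both AIC and BIC; this is a generic illustration on synthetic data, not the Global Group Relations analysis itself.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def select_gmm_order(data, max_components=5, criterion="bic", seed=0):
    """Fit GMMs of increasing order with EM and keep the model that
    minimizes the chosen information criterion (BIC or AIC)."""
    best_model, best_score = None, np.inf
    for k in range(1, max_components + 1):
        gmm = GaussianMixture(n_components=k, random_state=seed).fit(data)
        score = gmm.bic(data) if criterion == "bic" else gmm.aic(data)
        if score < best_score:
            best_model, best_score = gmm, score
    return best_model

# Synthetic 2-D data with two well-separated clusters
rng = np.random.default_rng(0)
data = np.vstack([rng.normal(-3.0, 1.0, (200, 2)),
                  rng.normal(3.0, 1.0, (200, 2))])
model = select_gmm_order(data)   # BIC should settle on 2 components
```

On sparse, high-dimensional survey data the BIC's stronger complexity penalty is the usual guard against the overfitting noted above.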
Contributors: Liu, Hui (Author) / Taylor, Thomas (Thesis advisor) / Cochran, Douglas (Thesis advisor) / Zhang, Junshan (Committee member) / Arizona State University (Publisher)
Created: 2012
Description
This dissertation describes a novel, low-cost strategy of using particle streak (track) images for accurate micro-channel velocity field mapping. It is shown that 2-dimensional, 2-component fields can be efficiently obtained using the spatial variation of particle track lengths in micro-channels. The velocity field is a critical performance feature of many microfluidic devices. Since it is often the case that un-modeled micro-scale physics frustrates principled design methodologies, particle-based velocity field estimation is an essential design and validation tool. Current technologies that achieve this goal use particle constellation correlation strategies and rely heavily on costly, high-speed imaging hardware. The proposed image/video processing based method achieves comparable accuracy for a fraction of the cost. In the context of micro-channel velocimetry, the usability of particle streaks has been poorly studied so far; their use has remained restricted mostly to bulk flow measurements and occasional ad-hoc uses in microfluidics. A second look at their usability in this work reveals that particle streak lengths can be used efficiently, approximately 15 years after their first use for micro-channel velocimetry. Particle tracks in steady, smooth microfluidic flows are mathematically modeled, and a framework for using experimentally observed particle track lengths for local velocity field estimation is introduced, followed by algorithm implementation and quantitative verification. Further, experimental considerations and image processing techniques that can facilitate the proposed methods are also discussed. The unavailability of benchmarked particle track image data motivated the implementation of a simulation framework capable of generating exposure-time-controlled particle track image sequences for given velocity vector fields.
This dissertation also describes that framework and shows that arbitrary velocity fields designed in computational fluid dynamics software tools can be used to obtain such images. Apart from aiding gold-standard data generation, such images would find use in quick microfluidic flow field visualization and help improve device designs.
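The core geometric idea, recovering local speed from streak length, fits in a few lines: an exposure of duration t produces a streak spanning roughly the particle's image diameter plus the distance it traveled, so speed is approximately (L - d) / t. The function and numbers below are illustrative assumptions, not values or code from the dissertation.

```python
def streak_speed(track_length_px, particle_diameter_px, exposure_s, um_per_px):
    """Estimate local flow speed (um/s) from one particle streak: the streak
    spans the particle diameter plus the distance traveled during exposure."""
    travel_px = track_length_px - particle_diameter_px
    return travel_px * um_per_px / exposure_s

# Assumed example: 52 px streak, 2 px particle, 10 ms exposure, 0.5 um/px
v = streak_speed(52.0, 2.0, 0.010, 0.5)   # -> 2500.0 um/s
```

Applying this to streaks detected across the channel yields the spatial variation of track lengths that the dissertation converts into a 2-D, 2-component velocity field.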
Contributors: Mahanti, Prasun (Author) / Cochran, Douglas (Thesis advisor) / Taylor, Thomas (Thesis advisor) / Hayes, Mark (Committee member) / Zhang, Junshan (Committee member) / Arizona State University (Publisher)
Created: 2011
Description
Current information on successful leadership and management practices is contradictory and inconsistent, which makes it difficult to understand which business practices are successful and which are not. The purpose of this study is to identify a simple process that quickly and logically identifies consistent and inconsistent leadership and management criteria. The hypothesis proposed is that Information Measurement Theory (IMT), along with the Kashiwagi Solution Model (KSM), is a methodology that can differentiate between accurate and inaccurate principles. The initial part of the study, a review of authors in these areas, shows how conflicting the information is; it also served to establish an initial baseline of recommended practices aligned with IMT. The one author who excels in comparison to the rest fits the "Initial Baseline Matrix from Deming," which composes the first model. The second model, denominated the "Full Extended KSM-Matrix," is composed of all the LS characteristics found among all the authors and IMT. Both models were tested for accuracy. The second part of the study evaluated the perception of individuals regarding these principles. Two different groups were evaluated: one group of people with prior training in and knowledge of IMT, and another group without any knowledge of IMT. The results of the survey showed more confusion in the group without knowledge of IMT, and improved consistency and less variation in the group with knowledge of IMT. The third part of the study, an analysis of case studies of success and failure, identified principles as contributors and categorized them into LS/type "A" characteristics and RS/type "C" characteristics by applying the KSM. The results validated the initial proposal and led to the conclusion that practices that fall on the LS side of the KSM lead to success, while practices that fall on the RS side lead to failure.
The comparison and testing of both models indicated dominant support for the IMT concepts as contributors to success, while the KSM model has higher predictive accuracy.
Contributors: Reynolds, Harry (Author) / Kashiwagi, Dean (Thesis advisor) / Sullivan, Kenneth (Committee member) / Badger, William (Committee member) / Arizona State University (Publisher)
Created: 2011