Matching Items (526)
Item 149709
Description
The price-based marketplace has dominated the construction industry. The majority of owners use price-based practices of management (expectation and decision making, control, direction, and inspection). The price-based management-and-control paradigm has not worked. Clients have been moving toward the best value environment (hiring contractors who know what they are doing, who preplan, and who manage and minimize risk and deviation). Owners are trying to move from client direction and control to hiring an expert and allowing that expert to do the quality control and risk management. This shift changes the contractor's paradigm from a reactive to a proactive position, from a bureaucratic, non-accountable entity to an accountable one, from a relationship-based, non-measuring entity to a measuring one, and to a contractor who manages and minimizes the risks they do not control. Years of price-based practices have caused poor quality and low performance in the construction industry. This research identifies what a best value contractor or vendor is, what factors make up a best value vendor, and the methodology to transform a vendor into a best value vendor. It uses deductive logic and a case study to confirm the logic and the proposed methodology.
Contributors: Pauli, Michele (Author) / Kashiwagi, Dean (Thesis advisor) / Sullivan, Kenneth (Committee member) / Badger, William (Committee member) / Arizona State University (Publisher)
Created: 2011
Item 150372
Description
As global competition continues to grow more disruptive, organizational change is an ever-present reality that affects companies in all industries at both the operational and strategic level. Organizational change capabilities have become a necessary aspect of existence for organizations in all industries worldwide. Research suggests that more than half of all organizational change efforts fail to achieve their original intended results, with some studies citing failure rates as high as 70 percent. Exacerbating this problem is the fact that no single change methodology has been universally accepted. This thesis examines two aspects of organizational change: the implementation of tactical and strategic initiatives, primarily focusing on successful tactical implementation techniques. This research proposes that tactical issues typically dominate the focus of change agents and recipients alike, often to the detriment of strategic-level initiatives that are vital to the overall value and success of the organizational change effort. The Delphi method was employed to develop a tool to facilitate the initial implementation of organizational change such that tactical barriers were minimized and available resources for strategic initiatives were maximized. Feedback from two expert groups of change agents and change facilitators was solicited to develop the tool and evaluate its impact. Preliminary pilot testing of the tool confirmed the proposal and successfully served to minimize tactical barriers to organizational change.
Contributors: Lines, Brian (Author) / Sullivan, Kenneth T. (Thesis advisor) / Badger, William (Committee member) / Kashiwagi, Dean (Committee member) / Arizona State University (Publisher)
Created: 2011
Item 150228
Description
The repression of reproductive competition and the enforcement of altruism are key components of the success of animal societies. Eusocial insects are defined by a reproductive division of labor, in which reproduction is relegated to one or a few individuals while the rest of the group members maintain the colony and help raise offspring. However, workers have retained the ability to reproduce in most insect societies. In the social Hymenoptera, due to haplodiploidy, workers can lay unfertilized, male-destined eggs without mating. Potential conflict between workers and queens can therefore arise over male production, and policing behaviors performed by nestmate workers and queens are a means of repressing worker reproduction. This work describes the means and results of the regulation of worker reproduction in the ant species Aphaenogaster cockerelli. Through manipulative laboratory studies on mature colonies, the lack of egg policing and the presence of physical policing by both workers and queens of this species are described. Through chemical analysis and artificial chemical treatments, the role of cuticular hydrocarbons as indicators of fertility status and the informational basis of policing in this species are demonstrated. An additional queen-specific chemical signal in the Dufour's gland is shown to direct nestmate aggression toward reproductive competitors. Finally, the actual level of worker-derived males in field colonies is measured. Together, these studies demonstrate the effectiveness of policing behaviors in suppressing worker reproduction in a social insect species and provide an example of how punishment and the threat of punishment are a powerful force in maintaining cooperative societies.
Contributors: Smith, Adrian A. (Author) / Liebig, Juergen (Thesis advisor) / Hoelldobler, Bert (Thesis advisor) / Gadau, Juergen (Committee member) / Johnson, Robert A. (Committee member) / Pratt, Stephen (Committee member) / Arizona State University (Publisher)
Created: 2011
Item 150111
Description
Finding the optimal solution to a problem with an enormous search space can be challenging. Unless a combinatorial construction technique is found that also guarantees the optimality of the resulting solution, this can be an infeasible task. If such a technique is unavailable, different heuristic methods are generally used to improve the upper bound on the size of the optimal solution. This dissertation presents an alternative method, which can be used to improve a solution to a problem rather than construct a solution from scratch. Necessity analysis, which is the key to this approach, is the process of analyzing the necessity of each element in a solution. The post-optimization algorithm presented here utilizes the result of the necessity analysis to improve the quality of the solution by eliminating unnecessary objects from it. While this technique could potentially be applied to different domains, this dissertation focuses on k-restriction problems, where a solution to the problem can be represented as an array. A scalable post-optimization algorithm for covering arrays is described, which starts from a valid solution and performs necessity analysis to iteratively improve the quality of the solution. It is shown that not only can this technique improve upon the previously best known results, it can also be added as a refinement step to any construction technique, and in most cases further improvements are expected. The post-optimization algorithm is then modified to accommodate every k-restriction problem, and this generic algorithm can be used as a starting point to create a reasonably sized solution for any such problem. The generic algorithm is then further refined for hash family problems by adding a conflict graph analysis to the necessity analysis phase. By recoloring the conflict graphs, a new degree of flexibility is explored, which can further improve the quality of the solution.
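As a concrete illustration of the necessity-analysis idea, the sketch below (a minimal Python sketch, not the dissertation's algorithm; names such as `coverage` and `post_optimize` are illustrative) deletes a row of a strength-t covering array whenever every t-way interaction that row covers is also covered by some other row, so the deletion cannot break coverage.

```python
from itertools import combinations

def coverage(rows, t):
    """Map each t-way interaction (column tuple, symbol tuple) to the set of rows covering it."""
    cov = {}
    for i, row in enumerate(rows):
        for cols in combinations(range(len(row)), t):
            key = (cols, tuple(row[c] for c in cols))
            cov.setdefault(key, set()).add(i)
    return cov

def post_optimize(rows, t):
    """Repeatedly drop any row whose every covered interaction is also covered elsewhere."""
    rows = [list(r) for r in rows]
    removed = True
    while removed:
        removed = False
        cov = coverage(rows, t)
        for i in range(len(rows)):
            if all(len(covering) > 1 for covering in cov.values() if i in covering):
                del rows[i]   # row i covers nothing uniquely, so coverage is preserved
                removed = True
                break         # recompute the coverage map before removing more rows
    return rows

# Example: the last row of this strength-2 covering array is redundant and gets removed.
ca = [[0, 0, 0], [0, 1, 1], [1, 0, 1], [1, 1, 0], [1, 1, 1]]
print(post_optimize(ca, t=2))   # 4 rows remain
```

Rebuilding the coverage map after every deletion keeps the sketch simple; a scalable version would update the coverage information incrementally instead.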
Contributors: Nayeri, Peyman (Author) / Colbourn, Charles (Thesis advisor) / Konjevod, Goran (Thesis advisor) / Sen, Arunabha (Committee member) / Stanzione Jr, Daniel (Committee member) / Arizona State University (Publisher)
Created: 2011
Item 150114
Description
Reverse engineering gene regulatory networks (GRNs) is an important problem in the domain of systems biology. Learning GRNs is challenging due to the inherent complexity of real regulatory networks and the heterogeneity of samples in available biomedical data. Real-world biological data are commonly collected from broad surveys (profiling studies) and aggregate highly heterogeneous biological samples. Popular methods to learn GRNs simplistically assume a single universal regulatory network corresponding to the available data, neglecting regulatory network adaptation due to changes in underlying conditions, cellular phenotype, or both. This dissertation presents a novel computational framework to learn the common regulatory interactions and networks underlying different sets of relatively homogeneous samples from real-world biological data. The characteristic set of samples/conditions and the corresponding regulatory interactions define the cellular context (context). Context, in this dissertation, represents the deterministic transcriptional activity within a specific cellular regulatory mechanism. The major contributions of this framework include: modeling and learning context-specific GRNs; associating enriched samples with contexts to interpret contextual interactions using biological knowledge; pruning extraneous edges from the context-specific GRN to improve the precision of the final GRNs; integrating multi-source data to learn inter- and intra-domain interactions and increase confidence in the obtained GRNs; and, finally, learning combinatorial conditioning factors from the data to identify regulatory cofactors. The framework, Expattern, was applied to both real-world and synthetic data. Interesting insights into the mechanism of action of drugs were obtained from analysis of NCI60 drug activity and gene expression data. Application to refractory cancer and glioblastoma multiforme data yielded GRNs that were readily annotated with context-specific phenotypic information. The refractory cancer GRNs also displayed associations between distinct cancers that were not observed through clustering alone. Performance comparisons on multi-context synthetic data show that Expattern performs better than other comparable methods.
Contributors: Sen, Ina (Author) / Kim, Seungchan (Thesis advisor) / Baral, Chitta (Committee member) / Bittner, Michael (Committee member) / Konjevod, Goran (Committee member) / Arizona State University (Publisher)
Created: 2011
Item 150133
Description
Facility managers have an important job in today's competitive business world, caring for the backbone of the corporation's capital. Maintaining assets and the efforts required to support them cause facility managers to fight an uphill battle to prove the worth of their organizations. This thesis discusses the important and flexible use of measurement and leadership reports and the benefits of justifying the work required to maintain or upgrade a facility. The task is streamlined by assigning accountability to subject matter experts. The facility manager must trust in the ability of his or her workforce to get the job done. However, with accountability comes increased risk. Even though accountability may not eliminate the need for control or put an end to reactionary actions, facility managers can develop key leadership-based reports to reassign accountability and measure subject matter experts while simultaneously reducing the reactionary actions that lead to increased cost. Identifying risks that the facility manager does not control and reassigning them to subject matter experts is imperative for effective facility management leadership; it allows facility managers to create an accurate and solid facility management plan, supports the organization's succession plan, and allows the organization to focus on its key competencies.
Contributors: Tellefsen, Thor (Author) / Sullivan, Kenneth (Thesis advisor) / Kashiwagi, Dean (Committee member) / Badger, William (Committee member) / Arizona State University (Publisher)
Created: 2011
Item 149703
Description
This dissertation studies routing in small-world networks such as grids augmented with long-range edges, as well as real networks. Kleinberg showed that geography-based greedy routing in a grid-based network takes an expected number of steps polylogarithmic in the network size, thus justifying the empirical efficiency observed since Milgram's experiments. A counterpart for the grid-based model is provided that creates all edges deterministically and admits an asymptotically matching upper bound on route length. The main goal is to improve greedy routing through a decentralized machine learning process. The two methods considered are based on weighted majority and on an algorithm of de Farias and Megiddo, both of which learn from feedback using ensembles of experts. Tests are run on both artificial and real networks, with decentralized spectral graph embedding supplying geometric information for real networks where it is not intrinsically available. An important measure analyzed in this work is overpayment, the difference between the cost of the method and that of the shortest path. Adaptive routing overtakes greedy routing after about a hundred or fewer searches per node, consistently across different network sizes and types. Learning stabilizes, typically at an overpayment of a third to a half of that incurred by greedy routing. The problem is made more difficult by eliminating knowledge of neighbors' locations or by introducing uncooperative nodes; even under these conditions, the learned routes are usually better than the greedy routes. The second part of the dissertation concerns the community structure of unannotated networks. A modularity-based algorithm of Newman is extended to work with overlapping communities (including considerably overlapping communities), where each node locally decides which potential communities it belongs to. To measure the quality of a cover of overlapping communities, a notion of a node's contribution to modularity is introduced, and the notion of modularity is subsequently extended from partitions to covers. The final part considers the problem of network anonymization, mostly by means of edge deletion, with utility preservation as the point of interest. It is shown that a concentration on preserving routing abilities might damage the preservation of community structure, and vice versa.
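For readers unfamiliar with the greedy baseline that the learned routing is compared against, here is a minimal sketch of geography-based greedy routing on an n x n grid; the helper `grid_neighbors` and the single long-range contact in the example are illustrative assumptions, and cost is simply the hop count.

```python
def manhattan(u, v):
    return abs(u[0] - v[0]) + abs(u[1] - v[1])

def greedy_route(source, target, neighbors):
    """Geography-based greedy routing: always forward to the neighbour closest to the target."""
    path, current = [source], source
    while current != target:
        best = min(neighbors(current), key=lambda w: manhattan(w, target))
        if manhattan(best, target) >= manhattan(current, target):
            return None   # stuck: no strictly closer neighbour (cannot happen on a full lattice)
        path.append(best)
        current = best
    return path

def grid_neighbors(n, long_range):
    """Lattice neighbours on an n x n grid plus optional long-range contacts per node."""
    def neighbors(u):
        x, y = u
        lattice = [(x + dx, y + dy) for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))
                   if 0 <= x + dx < n and 0 <= y + dy < n]
        return lattice + long_range.get(u, [])
    return neighbors

# A single long-range edge lets greedy routing shortcut across the grid.
route = greedy_route((0, 0), (9, 9), grid_neighbors(10, {(0, 0): [(7, 7)]}))
print(len(route) - 1)   # hop count: 1 long-range hop + 4 lattice hops = 5
```

The adaptive methods in the dissertation replace the fixed closest-to-target rule with a choice learned from feedback, which is what reduces overpayment relative to this greedy baseline.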
Contributors: Bakun, Oleg (Author) / Konjevod, Goran (Thesis advisor) / Richa, Andrea (Thesis advisor) / Syrotiuk, Violet R. (Committee member) / Czygrinow, Andrzej (Committee member) / Arizona State University (Publisher)
Created: 2011
Item 151901
Description
1. Aposematic signals advertise prey distastefulness or metabolic unprofitability to potential predators and have evolved independently in many prey groups over the course of evolutionary history as a means of protection from predation. Most aposematic signals investigated to date exhibit highly chromatic patterning; however, relatives in these toxic groups with patterns of very low chroma have been largely overlooked. 2. We propose that bright displays with low chroma arose in toxic prey species because they were more effective at deterring predation than their chromatic counterparts, especially when viewed in relatively low-light environments such as forest understories. 3. To test this prediction, we analyzed the reflectance and radiance of color patches on the wings of 90 tropical butterfly species that belong to groups with documented toxicity and that vary in their habitat preferences. The prediction: warning signal chroma and perceived chromaticity are expected to be higher, and brightness lower, in species that fly in open environments compared to those that fly in forested environments. 4. Analyses of the reflectance and radiance of warning color patches and predator visual modeling support this prediction. Moreover, phylogenetic tests, which correct for statistical non-independence due to the phylogenetic relatedness of the test species, also support the hypothesis of an evolutionary correlation between the perceived chromaticity of aposematic signals and the flight habits of the butterflies that exhibit these signals.
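As background on the reflectance summaries mentioned above, the sketch below computes two indices that are common conventions in the animal-coloration literature (mean brightness and a spread-based chroma); they are illustrative and not necessarily the exact measures or visual models used in this study.

```python
def brightness_and_chroma(wavelengths_nm, reflectance):
    """Summarize a reflectance spectrum with two common colorimetric indices.

    One convention from the animal-coloration literature: mean brightness is the
    average reflectance over 300-700 nm, and chroma (spectral purity) is the spread
    of reflectance relative to that mean.
    """
    visible = [r for w, r in zip(wavelengths_nm, reflectance) if 300 <= w <= 700]
    mean_brightness = sum(visible) / len(visible)
    chroma = (max(visible) - min(visible)) / mean_brightness
    return mean_brightness, chroma

# Example with a coarse 5-point spectrum (hypothetical reflectance values, in %).
wl = [300, 400, 500, 600, 700]
refl = [5.0, 10.0, 20.0, 60.0, 65.0]
print(brightness_and_chroma(wl, refl))   # (32.0, 1.875)
```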
Contributors: Douglas, Jonathan Marion (Author) / Rutowski, Ronald L (Thesis advisor) / Gadau, Juergen (Committee member) / McGraw, Kevin J. (Committee member) / Arizona State University (Publisher)
Created: 2013
Item 152185
Description
Over the past couple of decades, quality has been an area of increased focus, and multiple models and approaches have been proposed to measure quality in the construction industry. This paper focuses on determining the quality of one type of roofing system used in the construction industry: sprayed polyurethane foam (SPF) roofs. Thirty-seven urethane-coated SPF roofs installed in 2005/2006 were visually inspected three times, at the 4-year, 6-year, and 7-year marks, to measure the percentage of blisters and repairs. A repair criterion was established at the 6-year mark based on the data, and vulnerable roofs were reported to the contractors. Furthermore, the relation between roof quality and four possible time-of-installation factors, i.e., contractor, demographics, season, and difficulty (number of penetrations and roof size in square feet), was determined. Demographics and difficulty did not affect the quality of the roofs, whereas the contractor and the season in which the roof was installed did.
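A minimal sketch of the kind of factor comparison described above: group the inspected roofs by one installation factor and compare average blister percentages per group. The field names and numbers are hypothetical placeholders, not the thesis's data.

```python
from collections import defaultdict

def mean_blister_pct_by(roofs, factor):
    """Average blister percentage per level of one installation factor.

    `roofs` is a list of dicts with illustrative field names such as
    {"contractor": "A", "season": "summer", "sqft": 12000, "blister_pct": 1.8}.
    """
    groups = defaultdict(list)
    for roof in roofs:
        groups[roof[factor]].append(roof["blister_pct"])
    return {level: sum(vals) / len(vals) for level, vals in groups.items()}

# Example: compare installation seasons (hypothetical numbers).
roofs = [{"season": "summer", "blister_pct": 0.5},
         {"season": "summer", "blister_pct": 1.0},
         {"season": "winter", "blister_pct": 3.0}]
print(mean_blister_pct_by(roofs, "season"))   # {'summer': 0.75, 'winter': 3.0}
```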
Contributors: Gajjar, Dhaval (Author) / Kashiwagi, Dean (Thesis advisor) / Sullivan, Kenneth (Committee member) / Badger, William (Committee member) / Arizona State University (Publisher)
Created: 2013
Item 152172
Description
The primary function of the medium access control (MAC) protocol is managing access to a shared communication channel. From the viewpoint of transmitters, the MAC protocol determines each transmitter's persistence, the fraction of time it is permitted to spend transmitting. Schedule-based schemes implement stable persistences, achieving low variation in delay and throughput, and sometimes bounding maximum delay. However, they adapt slowly, if at all, to changes in the network. Contention-based schemes are agile, adapting quickly to changes in perceived contention, but suffer from short-term unfairness, large variations in packet delay, and poor performance at high load. The perfect MAC protocol, it seems, embodies the strengths of both contention- and schedule-based approaches while avoiding their weaknesses. This thesis culminates in the design of a Variable-Weight and Adaptive Topology Transparent (VWATT) MAC protocol. The design of VWATT first required answers to two questions: (1) If a node is equipped with schedules of different weights, which weight should it employ? (2) How is the node to compute the desired weight in a network lacking centralized control? The first question is answered by the Topology- and Load-Aware (TLA) allocation which defines target persistences that conform to both network topology and traffic load. Simulations show the TLA allocation to outperform IEEE 802.11, improving on the expectation and variation of delay, throughput, and drop rate. The second question is answered in the design of an Adaptive Topology- and Load-Aware Scheduled (ATLAS) MAC that computes the TLA allocation in a decentralized and adaptive manner. Simulation results show that ATLAS converges quickly on the TLA allocation, supporting highly dynamic networks. With these questions answered, a construction based on transversal designs is given for a variable-weight topology transparent schedule that allows nodes to dynamically and independently select weights to accommodate local topology and traffic load. The schedule maintains a guarantee on maximum delay when the maximum neighbourhood size is not too large. The schedule is integrated with the distributed computation of ATLAS to create VWATT. Simulations indicate that VWATT offers the stable performance characteristics of a scheduled MAC while adapting quickly to changes in topology and traffic load.
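As background on topology-transparent scheduling, the sketch below shows the classical polynomial construction over GF(p), which is closely related to transversal designs but is not the variable-weight schedule designed in this thesis: two distinct polynomials of degree at most d agree on at most d points, so any two nodes collide in at most d of the p subframes per frame, which is the property that yields a bounded maximum delay when neighbourhoods are not too large.

```python
def poly_schedule(coeffs, p):
    """Classical topology-transparent schedule over GF(p) (illustrative sketch).

    A node identified by polynomial coefficients `coeffs` (constant term first,
    degree <= d) transmits in slot f(x) of subframe x, for x = 0..p-1.  Distinct
    degree-<=d polynomials agree on at most d points, so any two nodes collide
    in at most d of their p transmission slots per frame.
    """
    def f(x):
        return sum(c * pow(x, i, p) for i, c in enumerate(coeffs)) % p
    return [(x, f(x)) for x in range(p)]   # (subframe, slot) pairs in a p x p frame

def persistence(schedule, frame_slots):
    """Fraction of the frame's slots in which the node transmits."""
    return len(schedule) / frame_slots

# Example with p = 5: two nodes with distinct degree-1 polynomials collide at most once per frame.
a = poly_schedule([1, 2], 5)               # f(x) = 1 + 2x (mod 5)
b = poly_schedule([3, 4], 5)               # g(x) = 3 + 4x (mod 5)
print(set(a) & set(b))                     # {(4, 4)}: the only shared (subframe, slot) pair
print(persistence(a, frame_slots=5 * 5))   # 5 of 25 slots -> 0.2
```

With one transmission slot per subframe, each node's persistence here is fixed at 1/p; the variable-weight construction described above is what lets a node select a schedule weight, and hence a persistence, suited to its local topology and traffic load.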
Contributors: Lutz, Jonathan (Author) / Colbourn, Charles J (Thesis advisor) / Syrotiuk, Violet R. (Thesis advisor) / Konjevod, Goran (Committee member) / Lloyd, Errol L. (Committee member) / Arizona State University (Publisher)
Created: 2013