Description
The price-based marketplace has dominated the construction industry. The majority of owners use price-based management practices (expectation setting and decision making, control, direction, and inspection). This price-based, management-and-control paradigm has not worked. Clients are now moving toward a best-value environment: hiring contractors who know what they are doing, who preplan, and who manage and minimize risk and deviation. Owners are trying to move from client direction and control to hiring an expert and allowing that expert to perform quality control and risk management. This change of environment shifts the contractor's paradigm: from reactive to proactive, from a bureaucratic, non-accountable position to an accountable one, from a relationship-based, non-measuring entity to a measuring one, and toward a contractor who manages and minimizes the risk that it does not control. Years of price-based practices have caused poor quality and low performance in the construction industry. This research identifies what a best-value contractor or vendor is, what factors make up a best-value vendor, and a methodology for transforming a vendor into a best-value vendor. It uses deductive logic and a case study to confirm the logic and the proposed methodology.
Contributors: Pauli, Michele (Author) / Kashiwagi, Dean (Thesis advisor) / Sullivan, Kenneth (Committee member) / Badger, William (Committee member) / Arizona State University (Publisher)
Created: 2011
Description
A good production schedule in a semiconductor back-end facility is critical for the on-time delivery of customer orders. Compared to the front-end process, which is dominated by re-entrant product flows, the back-end process is linear and therefore more amenable to scheduling. However, production scheduling of the back-end process remains very difficult because of the wide product mix, the large number of parallel machines, product-family-related setups, machine-product qualification, and weekly demand consisting of thousands of lots. In this research, a novel mixed-integer linear programming (MILP) model is proposed for batch production scheduling of a semiconductor back-end facility. In the MILP formulation, the manufacturing process is modeled as a flexible flow line with bottleneck stages, unrelated parallel machines, product-family-related sequence-independent setups, and product-machine qualification considerations. This MILP formulation is, however, difficult to solve for real-size problem instances. In a semiconductor back-end facility, production scheduling usually needs to be done every day while considering an updated demand forecast over a medium-term planning horizon. Due to the limit on the solvable size of the MILP model, a deterministic scheduling system (DSS), consisting of an optimizer and a scheduler, is proposed to provide sub-optimal solutions in a short time for real-size problem instances. The optimizer generates a tentative production plan; the scheduler then sequences each lot on each individual machine according to that plan and a set of scheduling rules. Customized factory rules and additional resource constraints, such as the preventive maintenance schedule, setup crew availability, and carrier limitations, are included in the DSS. Small problem instances are randomly generated to compare the performance of the MILP model and the deterministic scheduling system. Experimental design is then applied to understand the behavior of the DSS and to identify its best configuration under different demand scenarios. Product-machine qualification decisions have a long-term and significant impact on production scheduling, and a robust product-machine qualification matrix is critical for meeting demand when demand quantity or mix varies. In the second part of this research, a stochastic mixed-integer programming model is proposed to balance the tradeoff between current machine qualification costs and future backorder costs under uncertain demand. The L-shaped method and acceleration techniques are proposed to solve the stochastic model, and computational results compare the performance of the different solution methods.
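To make the flavor of such a formulation concrete, here is a minimal sketch in Python with PuLP of the assignment core only: lots on unrelated parallel machines with product-machine qualification and family-based sequence-independent setups, minimizing makespan. All data, names, and the tiny scale are hypothetical; the dissertation's actual MILP covers a full flexible flow line and is far larger.

```python
# Toy sketch (not the dissertation's formulation): lots go only to qualified
# machines, a family setup is paid once per machine that runs the family,
# and the machine with the largest workload defines the makespan.
import pulp

families = {"L1": "A", "L2": "A", "L3": "B", "L4": "B"}  # lot -> product family
machines = ["M1", "M2"]
proc = {("L1", "M1"): 3, ("L1", "M2"): 4, ("L2", "M1"): 2, ("L2", "M2"): 5,
        ("L3", "M1"): 6, ("L3", "M2"): 3, ("L4", "M1"): 4, ("L4", "M2"): 4}
qual = {(l, m): not (l == "L3" and m == "M1")  # L3 not qualified on M1
        for l in families for m in machines}
setup = {"A": 1, "B": 2}                       # sequence-independent family setups

prob = pulp.LpProblem("backend_batch_sched", pulp.LpMinimize)
x = pulp.LpVariable.dicts("x", [(l, m) for l in families for m in machines],
                          cat="Binary")
y = pulp.LpVariable.dicts("y", [(f, m) for f in set(families.values())
                                for m in machines], cat="Binary")
cmax = pulp.LpVariable("makespan", lowBound=0)
prob += cmax  # objective: minimize makespan

for l in families:
    prob += pulp.lpSum(x[l, m] for m in machines if qual[l, m]) == 1
    for m in machines:
        if not qual[l, m]:
            prob += x[l, m] == 0               # qualification constraint
        prob += x[l, m] <= y[families[l], m]   # a lot triggers its family setup

for m in machines:
    prob += (pulp.lpSum(proc[l, m] * x[l, m] for l in families)
             + pulp.lpSum(setup[f] * y[f, m] for f in setup)) <= cmax

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print(pulp.value(cmax))
```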
Contributors: Fu, Mengying (Author) / Askin, Ronald G. (Thesis advisor) / Zhang, Muhong (Thesis advisor) / Fowler, John W. (Committee member) / Pan, Rong (Committee member) / Sen, Arunabha (Committee member) / Arizona State University (Publisher)
Created: 2011
Description
As global competition grows more disruptive, organizational change is an ever-present reality that affects companies in all industries at both the operational and strategic levels. Organizational change capability has become a necessary aspect of existence for organizations in all industries worldwide. Research suggests that more than half of all organizational change efforts fail to achieve their original intended results, with some studies citing failure rates as high as 70 percent. Exacerbating this problem is the fact that no single change methodology has been universally accepted. This thesis examines two aspects of organizational change, the implementation of tactical and of strategic initiatives, focusing primarily on successful tactical implementation techniques. The research proposed that tactical issues typically dominate the focus of change agents and recipients alike, often to the detriment of strategic-level initiatives that are vital to the overall value and success of the change effort. The Delphi method was employed to develop a tool to facilitate the initial implementation of organizational change such that tactical barriers were minimized and the resources available for strategic initiatives were maximized. Feedback from two expert groups of change agents and change facilitators was solicited to develop the tool and evaluate its impact. Preliminary pilot testing of the tool confirmed the proposal: it successfully minimized tactical barriers to organizational change.
Contributors: Lines, Brian (Author) / Sullivan, Kenneth T. (Thesis advisor) / Badger, William (Committee member) / Kashiwagi, Dean (Committee member) / Arizona State University (Publisher)
Created: 2011
Description
This dissertation transforms a set of system complexity reduction problems into feature selection problems. Three systems are considered: classification based on association rules, network structure learning, and time series classification. Furthermore, two variable importance measures are proposed to reduce feature selection bias in tree models. Associative classifiers can achieve high accuracy, but the combination of many rules is difficult to interpret. Rule condition subset selection (RCSS) methods for associative classification are considered. RCSS prunes the rule conditions to a subset via feature selection; the subset can then be summarized into rule-based classifiers. Experiments show that classifiers after RCSS can substantially improve classification interpretability without loss of accuracy. An ensemble feature selection method is proposed to learn Markov blankets for either discrete or continuous networks (without linear or Gaussian assumptions). The method is compared to a Bayesian local structure learning algorithm and to alternative feature selection methods on the causal structure learning problem. Feature selection is also used to enhance the interpretability of time series classification. Existing time series classification algorithms (such as nearest-neighbor with dynamic time warping measures) are accurate but difficult to interpret. This research leverages the time-ordering of the data to extract features and generates an effective and efficient classifier referred to as a time series forest (TSF). The computational complexity of TSF is only linear in the length of the time series, and interpretable features can be extracted. These features can be further reduced and summarized for even better interpretability. Lastly, two variable importance measures are proposed to reduce feature selection bias in tree-based ensemble models. It is well known that bias can occur when predictor attributes have different numbers of values. Two methods are proposed to solve this bias problem: one uses an out-of-bag sampling method, called OOBForest, and the other, based on the new concept of a partial permutation test, is called pForest. Experimental results show that existing methods are not always reliable for multi-valued predictors, while the proposed methods have advantages.
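To illustrate the interval-feature idea behind TSF, the sketch below (an illustration, not the dissertation's code) extracts the mean, standard deviation, and least-squares slope over fixed time intervals and feeds them to an off-the-shelf random forest. The synthetic data and the interval choices are hypothetical; TSF itself samples intervals randomly within each tree.

```python
# Interval features in the spirit of TSF (hypothetical data and intervals):
# each interval contributes its mean, std, and slope, so the classifier
# stays interpretable and the feature cost is linear in the series length.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def interval_features(X, intervals):
    feats = []
    for s, e in intervals:
        seg = X[:, s:e]
        t = np.arange(s, e)
        slope = np.polyfit(t, seg.T, 1)[0]  # least-squares slope per series
        feats += [seg.mean(axis=1), seg.std(axis=1), slope]
    return np.column_stack(feats)

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 100))              # 200 series of length 100
y = rng.integers(0, 2, size=200)
X[y == 1, 50:] += np.linspace(0.0, 2.0, 50)  # class 1 drifts upward late

intervals = [(0, 25), (25, 50), (50, 75), (75, 100)]
clf = RandomForestClassifier(n_estimators=200, oob_score=True, random_state=0)
clf.fit(interval_features(X, intervals), y)
print(clf.oob_score_)  # out-of-bag accuracy on the synthetic task
```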
Contributors: Deng, Houtao (Author) / Runger, George C. (Thesis advisor) / Lohr, Sharon L. (Committee member) / Pan, Rong (Committee member) / Zhang, Muhong (Committee member) / Arizona State University (Publisher)
Created: 2011
Description
Hydropower generation is a clean, renewable energy source that has received great attention in the power industry. Hydropower has been the leading source of renewable energy, providing more than 86% of all electricity generated by renewable sources worldwide. The life span of a hydropower plant is generally considered to be 30 to 50 years. Plants over 30 years old usually conduct a rehabilitation feasibility study covering their entire facilities, including infrastructure. By age 35, the forced outage rate increases by 10 percentage points compared to the previous year, and much longer outages occur in plants older than 20 years; consequently, the forced outage rate grows exponentially because of these longer outages. Although such long forced outages are infrequent, their impact is immense. If the reasonable timing of rehabilitation is missed, an abrupt long-term outage could occur, followed by additional unnecessary repairs and inefficiencies; on the other hand, replacing equipment too early wastes revenue. The hydropower plants of Korea Water Resources Corporation (hereafter K-water) are utilized for this study, with twenty-four K-water generators comprising the population for quantifying the reliability of each piece of equipment. A facility in a hydropower plant is a repairable system, because most failures can be fixed without replacing the entire facility. The fault data of each power plant are collected, of which only forced-outage faults are used as the raw data for the reliability analyses. The mean cumulative repair function (MCF) of each facility is determined from the failure data tables using Nelson's graphical method. The power law model, a popular model for repairable systems, is also fitted to represent equipment reliability and system availability. The criterion-based analysis of HydroAmp is used to provide a more accurate picture of the reliability of each power plant. Two case studies are presented to enhance understanding of the availability of each power plant and to present economic evaluations of modernization. Finally, equipment in a hydropower plant is categorized into two groups based on reliability to determine modernization timing, and suitable replacement periods are obtained by simulation.
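For illustration, Nelson's nonparametric MCF estimator referred to above can be sketched in a few lines: at each recurrence time, the MCF steps up by one over the number of units still under observation. The outage records below are hypothetical, not K-water data.

```python
# Nelson's nonparametric MCF from recurrence data (hypothetical records):
# unit -> (end of its observation window in years, forced-outage times).
units = {
    "G1": (10.0, [2.0, 7.5]),
    "G2": (8.0,  [3.0]),
    "G3": (12.0, [1.5, 4.0, 9.0]),
}

events = sorted(t for _, times in units.values() for t in times)
mcf, total = [], 0.0
for t in events:
    at_risk = sum(1 for end, _ in units.values() if end >= t)  # units observed at t
    total += 1.0 / at_risk                                     # MCF step at this event
    mcf.append((t, total))
print(mcf)  # cumulative mean number of repairs per unit vs. time
```

For the parametric side, a time-truncated power law (Crow-AMSAA) process observed to time T with n failures at times t_i has the closed-form shape estimate beta_hat = n / sum(ln(T / t_i)), which makes fitting the model to each unit straightforward.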
Contributors: Kwon, Ogeuk (Author) / Holbert, Keith E. (Thesis advisor) / Heydt, Gerald T. (Committee member) / Pan, Rong (Committee member) / Arizona State University (Publisher)
Created: 2011
Description
Facility managers have an important job in today's competitive business world, caring for the backbone of the corporation's capital. Maintaining assets and the associated support efforts cause facility managers to fight an uphill battle to prove the worth of their organizations. This thesis discusses the important and flexible use of measurement and leadership reports and the benefits of justifying the work required to maintain or upgrade a facility. The task is streamlined by assigning accountability to subject-matter experts: the facility manager must trust the ability of his or her workforce to get the job done. With accountability, however, comes increased risk. Even though accountability may not restore total control or stop reactionary actions, facility managers can develop key leadership-based reports that reassign accountability to, and measure, subject-matter experts while simultaneously reducing the reactionary actions that lead to increased cost. Identifying risks that are not controlled and reassigning them to subject-matter experts is imperative for effective facility management leadership; it allows facility managers to create an accurate and solid facility management plan, supports the organization's succession plan, and allows the organization to focus on key competencies.
Contributors: Tellefsen, Thor (Author) / Sullivan, Kenneth (Thesis advisor) / Kashiwagi, Dean (Committee member) / Badger, William (Committee member) / Arizona State University (Publisher)
Created: 2011
Description
Product reliability is now a top concern of manufacturers, and customers prefer products that perform well over a long period. Because most products can last for years or even decades, accelerated life testing (ALT) is used to estimate product lifetime. Much research has been done in the ALT area, and optimal design for ALT is a major topic. This dissertation consists of three main studies. First, a methodology for finding optimal ALT designs with right censoring and interval censoring is developed; it employs the proportional hazards (PH) model and the generalized linear model (GLM) to simplify the computational process. A sensitivity study shows the effects of the model parameters on the designs. Second, an extended version of the I-optimal design for ALT is discussed, and a dual-objective design criterion is defined and demonstrated with several examples; several graphical tools are also developed for evaluating candidate designs. Finally, designs for model checking when more than one candidate model is available are discussed.
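As a sketch of the PH/GLM connection exploited here: with inspection (interval) data, the probability that a unit fails within an interval follows a binomial GLM with a complementary log-log link, so a candidate ALT design can be scored by the determinant of the GLM information matrix, the D-criterion. The planning values and candidate designs below are hypothetical.

```python
# Scoring candidate ALT designs via the GLM information matrix (a sketch
# under assumed planning values, not the dissertation's actual designs).
import numpy as np

beta = np.array([-2.0, 1.5])  # hypothetical planning values (intercept, stress slope)

def d_criterion(stress, n_units):
    """det(X' W X) for a binomial GLM with complementary log-log link."""
    X = np.column_stack([np.ones_like(stress), stress])
    eta = X @ beta
    mu = 1.0 - np.exp(-np.exp(eta))           # P(fail within interval)
    dmu = np.exp(eta - np.exp(eta))           # d mu / d eta for cloglog link
    w = n_units * dmu**2 / (mu * (1.0 - mu))  # GLM weight at each design point
    return np.linalg.det(X.T @ (w[:, None] * X))

# compare two candidate two-point designs (stress standardized to [0, 1])
print(d_criterion(np.array([0.0, 1.0]), np.array([50, 50])))
print(d_criterion(np.array([0.3, 1.0]), np.array([70, 30])))
```

A D-optimal search would then maximize this criterion over stress levels and unit allocations, subject to the total sample size and test-duration constraints.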
Contributors: Yang, Tao (Author) / Pan, Rong (Thesis advisor) / Montgomery, Douglas C. (Committee member) / Borror, Connie (Committee member) / Rigdon, Steve (Committee member) / Arizona State University (Publisher)
Created: 2013
Description
Over the past couple of decades, quality has been an area of increased focus, and multiple models and approaches have been proposed to measure quality in the construction industry. This paper focuses on determining the quality of one type of roofing system used in the construction industry: sprayed polyurethane foam (SPF) roofs. Thirty-seven urethane-coated SPF roofs installed in 2005/2006 were visually inspected three times, at the 4-, 6-, and 7-year marks, to measure the percentage of blisters and repairs. A repair criterion was established after the 6-year mark based on the collected data, and roofs meeting it were reported to contractors as vulnerable. Furthermore, the relationship between roof quality and four possible time-of-installation factors, i.e., contractor, demographics, season, and difficulty (number of penetrations and roof size in square feet), was examined. Demographics and difficulty did not affect the quality of the roofs, whereas the contractor and the season in which the roof was installed did.
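One simple way to test whether a categorical installation factor such as season is associated with roof condition is a chi-square test of independence on the inspection counts; the contingency table below is hypothetical, not the study's data.

```python
# Chi-square test of independence (hypothetical counts, for illustration):
# rows = installation season, columns = (acceptable, vulnerable) roofs.
import numpy as np
from scipy.stats import chi2_contingency

table = np.array([[12, 3],    # summer installs
                  [ 8, 9]])   # winter installs
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2={chi2:.2f}, p={p:.3f}")  # a small p suggests season matters
```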
Contributors: Gajjar, Dhaval (Author) / Kashiwagi, Dean (Thesis advisor) / Sullivan, Kenneth (Committee member) / Badger, William (Committee member) / Arizona State University (Publisher)
Created: 2013
Description
With the increase in computing power and the availability of data, there has never been a greater need to understand data and make decisions from it. Traditional statistical techniques may not be adequate to handle the size of today's data sets or the complexity of the information hidden within them; knowledge discovery by machine learning techniques is therefore necessary if we want to better understand information in data. This dissertation explores the topics of asymmetric loss and asymmetric data in machine learning and proposes new algorithms for some of the problems in these areas; it also studies variable selection for matched data sets and proposes a solution when there is non-linearity in the matched data. The research is divided into three parts. The first part addresses the problem of asymmetric loss: a proposed asymmetric support vector machine (aSVM) is used to predict specific classes with high accuracy, and it was shown to produce higher precision than a regular SVM. The second part addresses asymmetric data sets, in which variables are predictive for only a subset of the predictor classes; an asymmetric random forest (ARF) is proposed to detect such variables. The third part explores variable selection for matched data sets. A matched random forest (MRF) is proposed to find variables that can distinguish case from control without the restrictions that exist in linear models; MRF detects such variables even in the presence of interactions and qualitative variables.
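The aSVM modifies the SVM loss itself; a rough off-the-shelf approximation of asymmetric loss (a sketch on synthetic data, not the dissertation's formulation) is to re-weight classes in a standard SVM so that false positives on the target class are penalized more heavily, trading recall for precision.

```python
# Asymmetric loss via class weights: up-weighting class 0 penalizes
# predicting class 1 on true-0 samples, raising precision for class 1.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import precision_score, recall_score

rng = np.random.default_rng(0)
X = rng.normal(size=(600, 5))
y = (X[:, 0] + 0.5 * rng.normal(size=600) > 0.8).astype(int)  # synthetic labels
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

for w in (1, 5, 20):  # heavier penalty on misclassifying class-0 samples
    clf = SVC(class_weight={0: w, 1: 1}).fit(Xtr, ytr)
    pred = clf.predict(Xte)
    print(w,
          round(precision_score(yte, pred, zero_division=0), 3),
          round(recall_score(yte, pred, zero_division=0), 3))
```

As the weight grows, precision on the target class typically rises while recall falls, which is the tradeoff an asymmetric-loss classifier is designed to control.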
Contributors: Koh, Derek (Author) / Runger, George C. (Thesis advisor) / Wu, Tong (Committee member) / Pan, Rong (Committee member) / Cesta, John (Committee member) / Arizona State University (Publisher)
Created: 2013
Description
The goal of this research study was to identify the competencies the project manager (PM) will need to respond to the challenges the construction industry faces in 2022 and beyond. The study revealed twenty-one emerging challenges for construction PMs, grouped into four primary disruptive forces: workforce demographics, globalization, rapidly evolving technology, and changing organizational structures. The future PM will respond to these emerging challenges using a combination of fourteen competencies. The competencies are grouped into four categories: technical (multi-disciplined, practical understanding of technology), management (keen business insight, understanding of project management, knowledge network building, continuous risk monitoring), cognitive (complex decision making, emotional maturity, effective communication), and leadership (leveraging diverse thinking, building relationships, engaging others, mentoring, building trust). Popular data collection methods used in project management research, such as surveys and interviews, have been criticized for the gap between stated responses (what respondents say they will do) and revealed preferences (what they actually practice in the workplace). Rather than relying on surveys, this research utilized information generated from games and exercises bundled into one-day training seminars conducted by Construction Industry Institute (CII) companies for current and upcoming generations of PMs. Educational games and exercises give participants the opportunity to apply classroom learning and workplace experience to issues presented in real-world scenarios, producing responses that are more closely aligned with the actual decisions and activities occurring on projects. The future competencies were identified by combining the results of the literature review with information from the games and exercises through an iterative cycle of data mining, analysis, and consolidation review sessions with CII members. This competency forecast will be used as a basis for company recruiting and to create tools for professional development programs and project management education at the university level. In addition to the competency forecast, the research identified simulation games and exercises as components of a classroom project management development program, and an instrument has been developed that links the emerging challenges with the fourteen competencies and the learning tools that facilitate mastering them.
Contributors: King, Cynthia Joyce (Author) / Wiezel, Avi (Thesis advisor) / Badger, William (Committee member) / Sullivan, Kenneth (Committee member) / Arizona State University (Publisher)
Created: 2012