Matching Items (235)
Description
Current information on successful leadership and management practices is contradictory and inconsistent, which makes it difficult to understand which business practices are successful and which are not. The purpose of this study is to identify a simple process that quickly and logically identifies consistent and inconsistent leadership and management criteria. The hypothesis proposed is that Information Measurement Theory (IMT), along with the Kashiwagi Solution Model (KSM), is a methodology that can differentiate between accurate and inaccurate principles. The initial part of the study, a review of authors in these areas, shows how conflicting the information is, and it also served to establish an initial baseline of recommended practices aligned with IMT. The one author who excels in comparison to the rest, Deming, anchors the "Initial Baseline Matrix from Deming," which constitutes the first model. The second model, the "Full Extended KSM-Matrix," is composed of all the LS characteristics found among all authors and IMT. Both models were tested for accuracy. The second part of the study evaluated the perception of individuals regarding these principles. Two different groups were evaluated: one group with prior training and knowledge of IMT, and another group without any knowledge of IMT. The survey results showed more confusion in the group without knowledge of IMT, and improved consistency and less variation in the group with knowledge of IMT. The third part of the study, an analysis of case studies of success and failure, identified principles as contributors and, by applying the KSM, categorized them into LS/type "A" characteristics and RS/type "C" characteristics. The results validated the initial proposal and led to the conclusion that practices that fall on the LS side of the KSM lead to success, while practices that fall on the RS side lead to failure.
The comparison and testing of both models indicated dominant support for the IMT concepts as contributors to success, while the KSM model has a higher accuracy of prediction.
Contributors: Reynolds, Harry (Author) / Kashiwagi, Dean (Thesis advisor) / Sullivan, Kenneth (Committee member) / Badger, William (Committee member) / Arizona State University (Publisher)
Created: 2011
Description
In this work, the vapor transport and aerobic bio-attenuation of compounds from a multi-component petroleum vapor mixture were studied for six idealized lithologies in 1.8-m tall laboratory soil columns. Columns representing different geological settings were prepared using 20-40 mesh sand (medium-grained) and 16-minus mesh crushed granite (fine-grained). The contaminant vapor source was a liquid composed of twelve petroleum hydrocarbons common in weathered gasoline. It was placed in a chamber at the bottom of each column and the vapors diffused upward through the soil to the top where they were swept away with humidified gas. The experiment was conducted in three phases: i) nitrogen sweep gas; ii) air sweep gas; iii) vapor source concentrations decreased by ten times from the original concentrations and under air sweep gas. Oxygen, carbon dioxide and hydrocarbon concentrations were monitored over time. The data allowed determination of times to reach steady conditions, effluent mass emissions and concentration profiles. Times to reach near-steady conditions were consistent with theory and chemical-specific properties. First-order degradation rates were highest for straight-chain alkanes and aromatic hydrocarbons. Normalized effluent mass emissions were lower for lower source concentration and aerobic conditions. At the end of the study, soil core samples were taken every 6 in. Soil moisture content analyses showed that water had redistributed in the soil during the experiment. The soil at the bottom of the columns generally had higher moisture contents than initial values, and soil at the top had lower moisture contents. Profiles of the number of colony forming units of hydrocarbon-utilizing bacteria/g-soil indicated that the highest concentrations of degraders were located at the vertical intervals where maximum degradation activity was suggested by CO2 profiles. 
Finally, the near-steady conditions of each phase of the study were simulated using a three-dimensional transient numerical model. The model was fit to the Phase I data by adjusting soil properties, and then fit to Phase III data to obtain compound-specific first-order biodegradation rate constants ranging from 0.0 to 5.7x10^3 d^-1.
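The compound-specific first-order biodegradation rate constants reported above determine how quickly a vapor-phase hydrocarbon is consumed. As a minimal sketch of what such a constant implies (the function name and example values are ours, not from the study), the fraction of a compound remaining after time t under first-order decay is exp(-k*t):

```python
import math

def remaining_fraction(k_per_day: float, t_days: float) -> float:
    """Fraction of a compound remaining after t_days of first-order
    decay with rate constant k_per_day (units of d^-1)."""
    return math.exp(-k_per_day * t_days)

# Illustrative: a rate constant of 2.0 d^-1 leaves about 13.5% after one day.
print(remaining_fraction(2.0, 1.0))
```

A rate constant of zero (the lower end of the reported range) corresponds to no biodegradation at all, so the remaining fraction stays at 1.0.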
Contributors: Escobar Melendez, Elsy (Author) / Johnson, Paul C. (Thesis advisor) / Andino, Jean (Committee member) / Forzani, Erica (Committee member) / Krajmalnik-Brown, Rosa (Committee member) / Kavazanjian, Edward (Committee member) / Arizona State University (Publisher)
Created: 2012
Description
Contamination by chlorinated ethenes is widespread in groundwater aquifers, sediment, and soils worldwide. The overarching objectives of my research were to understand how bacteria of the genus Dehalococcoides function optimally to carry out reductive dechlorination of chlorinated ethenes in a mixed microbial community and then apply this knowledge to manage dechlorinating communities in the hydrogen-based membrane biofilm reactor (MBfR). The MBfR is used for the biological reduction of oxidized contaminants in water using hydrogen supplied as the electron donor by diffusion through gas-transfer fibers. First, I characterized a new anaerobic dechlorinating community developed in our laboratory, named DehaloR^2, in terms of chlorinated ethene turnover rates and assessed its microbial community composition. I then carried out an experiment to correlate performance and community structure for trichloroethene (TCE)-fed microbial consortia. Fill-and-draw reactors inoculated with DehaloR^2 demonstrated a direct correlation between microbial community function and structure as the TCE-pulsing rate was increased. An electron-balance analysis predicted the community structure based on measured concentrations of products and constant net yields for each microorganism. The predictions corresponded to trends in the community structure based on pyrosequencing and quantitative PCR up to the highest TCE pulsing rate, where deviations from the trend resulted from stress by the chlorinated ethenes. Next, I optimized a method for simultaneous detection of chlorinated ethenes and ethene at or below the Environmental Protection Agency maximum contaminant levels for groundwater using solid phase microextraction in a gas chromatograph with a flame ionization detector. This method is ideal for monitoring biological reductive dechlorination in groundwater, where ethene is the ultimate end product.
The major advantage of this method is that it uses a small sample volume of 1 mL, making it ideally suited for bench-scale feasibility studies, such as the MBfR. Last, I developed a reliable start-up and operation strategy for TCE reduction in the MBfR. Successful operation relied on controlling the pH-increase effects of methanogenesis and homoacetogenesis, along with creating hydrogen limitation during start-up to allow dechlorinators to compete against other microorganisms. Methanogens were additionally minimized during continuous flow operation by a limitation in bicarbonate resulting from strong homoacetogenic activity.
Contributors: Ziv-El, Michal (Author) / Rittmann, Bruce E. (Thesis advisor) / Krajmalnik-Brown, Rosa (Thesis advisor) / Halden, Rolf U. (Committee member) / Arizona State University (Publisher)
Created: 2012
Description
The overall goal of this dissertation is to advance understanding of biofilm reduction of oxidized contaminants in water and wastewater. Chapter 1 introduces the fundamentals of biological reduction of three oxidized contaminants (nitrate, perchlorate, and trichloroethene (TCE)) using two biofilm processes (hydrogen-based membrane biofilm reactors (MBfR) and packed-bed heterotrophic reactors (PBHR)), and it identifies the research objectives. Chapters 2 through 6 focus on nitrate removal using the MBfR and PBHR, while chapters 7 through 10 investigate simultaneous reduction of nitrate and another oxidized compound (perchlorate, sulfate, or TCE) in the MBfR. Chapter 11 summarizes the major findings of this research. Chapters 2 and 3 demonstrate nitrate removal in a groundwater and identify the maximum nitrate loadings using a pilot-scale MBfR and a pilot-scale PBHR, respectively. Chapter 4 compares the MBfR and the PBHR for denitrification of the same nitrate-contaminated groundwater. The comparison includes the maximum nitrate loading, the effluent water quality of the denitrification reactors, and the impact of post-treatment on water quality. Chapter 5 theoretically and experimentally demonstrates that the nitrate biomass-carrier surface loading, rather than the traditionally used empty bed contact time or nitrate volumetric loading, is the primary design parameter for heterotrophic denitrification. Chapter 6 constructs a pH-control model to predict pH, alkalinity, and precipitation potential in heterotrophic or hydrogen-based autotrophic denitrification reactors. Chapter 7 develops and uses steady-state permeation tests and a mathematical model to determine the hydrogen-permeation coefficients of three fibers commonly used in the MBfR. The coefficients are then used as inputs for the three models in Chapters 8-10. Chapter 8 develops a multispecies biofilm model for simultaneous reduction of nitrate and perchlorate in the MBfR.
The model quantitatively and systematically explains how operating conditions affect nitrate and perchlorate reduction and biomass distribution via four mechanisms. Chapter 9 modifies the nitrate and perchlorate model into a nitrate and sulfate model and uses it to identify operating conditions corresponding to onset of sulfate reduction. Chapter 10 modifies the nitrate and perchlorate model into a nitrate and TCE model and uses it to investigate how operating conditions affect TCE reduction and accumulation of TCE reduction intermediates.
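Chapter 5's argument, as summarized above, is that the nitrate load applied per unit of biomass-carrier surface area, rather than empty bed contact time, is the primary design parameter for heterotrophic denitrification. A minimal sketch of that quantity (the function name and example numbers are ours, not from the dissertation):

```python
def nitrate_surface_loading(flow_m3_per_day: float,
                            nitrate_g_per_m3: float,
                            carrier_area_m2: float) -> float:
    """Nitrate load applied per unit biomass-carrier surface area,
    in g NO3-N per m^2 per day."""
    return flow_m3_per_day * nitrate_g_per_m3 / carrier_area_m2

# Illustrative: 100 m^3/d of water at 50 g/m^3 nitrate-N spread over
# 2000 m^2 of carrier media gives 2.5 g N/m^2/d.
print(nitrate_surface_loading(100, 50, 2000))
```

The point of using a surface loading is that two reactors with the same empty bed contact time but different specific surface areas apply very different loads to their biofilms.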
Contributors: Tang, Youneng (Author) / Rittmann, Bruce E. (Thesis advisor) / Westerhoff, Paul (Committee member) / Krajmalnik-Brown, Rosa (Committee member) / Halden, Rolf (Committee member) / Arizona State University (Publisher)
Created: 2012
Description
The spread of invasive species may be greatly affected by human responses to prior species spread, but models and estimation methods seldom explicitly consider human responses. I investigate the effects of management responses on estimates of invasive species spread rates. To do this, I create an agent-based simulation model of an insect invasion across a county-level citrus landscape. My model provides an approximation of a complex spatial environment while allowing the "truth" to be known. The modeled environment consists of citrus orchards with insect pests dispersing among them. Insects move across the simulation environment infesting orchards, while orchard managers respond by administering insecticide according to analyst-selected behavior profiles, and management responses may depend on prior invasion states. Dispersal data are generated in each simulation and used to calculate spread rate via a set of estimators selected for their predominance in the empirical literature. Spread rate is a mechanistic, emergent phenomenon measured at the population level, caused by a suite of latent biological, environmental, and anthropogenic factors. I test the effectiveness of orchard behavior profiles on invasion suppression and evaluate the robustness of the estimators given orchard responses. I find that allowing growers to use future expectations of spread in management decisions leads to reduced spread rates. By acting in a preventative manner and applying insecticide before insects are actually present, orchards are able to lower spread rates more than by reactive behavior alone. Spread rates are highly sensitive to spatial configuration. Spatial configuration is hardly a random process, consisting of many latent factors often not accounted for in spread rate estimation. Not considering these factors may lead to an omitted variables bias and skew estimation results.
The ability of spread rate estimators to predict future spread varies considerably between estimators, and with spatial configuration, invader biological parameters, and orchard behavior profile. The model suggests that understanding the latent factors inherent to dispersal is important for selecting phenomenological models of spread and interpreting estimation results. This indicates a need for caution when evaluating spread. Although standard practice, current empirical estimators may both over- and underestimate spread rate in the simulation.
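Many of the empirical spread-rate estimators referenced above reduce to regressing a measure of invaded extent against time. A minimal illustration of one such estimator (the function name and synthetic data are ours, not from the model): ordinary least squares on the maximum invaded distance per time step.

```python
def spread_rate(times: list, max_distances: list) -> float:
    """Estimate spread rate as the OLS slope of maximum invaded
    distance regressed against time."""
    n = len(times)
    mean_t = sum(times) / n
    mean_d = sum(max_distances) / n
    num = sum((t - mean_t) * (d - mean_d)
              for t, d in zip(times, max_distances))
    den = sum((t - mean_t) ** 2 for t in times)
    return num / den

# Synthetic invasion front advancing 3 distance units per time step:
print(spread_rate([0, 1, 2, 3, 4], [0, 3, 6, 9, 12]))  # 3.0
```

In the simulation, management responses perturb the front's advance, so the fitted slope can diverge from the underlying biological dispersal rate; this is the sensitivity the study examines.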
Contributors: Shanafelt, David William (Author) / Fenichel, Eli P. (Thesis advisor) / Richards, Timothy (Committee member) / Janssen, Marco (Committee member) / Arizona State University (Publisher)
Created: 2012
Description
The introduction of novel information technology within contemporary healthcare settings presents a critical juncture for the industry and thus underscores the importance of better understanding the impact of this emerging "health 2.0" landscape. Simply, how such technology may affect the healthcare system is still not fully realized, despite the ever-growing need to adopt it in order to serve a growing patient population. Thus, two pertinent questions are posed: is health information technology (HIT) useful and practical and, if so, what is the best way to implement it? This study examined the clinical implementation of specific instances of HIT so as to weigh its benefits and risks and ultimately construct a proposal for successful widespread adoption. Because information analysis is central to HIT, Information Measurement Theory (IMT) was used to measure the effectiveness of current HIT systems as well as to elucidate improvements for future implementation. The results indicate that increased transparency, attention to patient-focused approaches, and proper IT training will not only allow HIT to better serve the community, but will also decrease inefficient healthcare expenditure.
Contributors: Maietta, Myles Anthony (Author) / Kashiwagi, Dean (Thesis director) / Kashiwagi, Jacob (Committee member) / Barrett, The Honors College (Contributor) / Department of Psychology (Contributor) / School of Life Sciences (Contributor)
Created: 2015-05
Description
The value of data in the construction industry is driven by the actual worth or usefulness the data can provide. The revolutionary Best Value Performance Information Procurement System, implemented in the industry by the Performance Based Studies Research Group (PBSRG) at ASU, optimizes the value of data. By simplifying the details and complexity of a construction project through dominant and logical thinking, the Best Value system delivers efficient, low-risk success. The Best Value model's implementation in industry projects is observed in the PBSRG Minnesota projects in order to improve data collection and metric analysis. The Minnesota projects specifically have an issue with delivering Best Value transparency, the notion that the details of project data should be used to support dominant ideas. By improving and simplifying PBSRG's data collection tools, Best Value transparency can be achieved more easily and effectively, in turn improving the Best Value system.
Contributors: Misiak, Erik Richard (Author) / Kashiwagi, Dean (Thesis director) / Kashiwagi, Jacob (Committee member) / Barrett, The Honors College (Contributor) / Mechanical and Aerospace Engineering Program (Contributor)
Created: 2015-05
Description
The current model of revenue generation for some free-to-play video games is preventing the companies controlling them from growing, but with a few changes in approach these issues could be alleviated. A new style of video game, called a MOBA (Multiplayer Online Battle Arena), has emerged in the past few years, bringing with it a new style of generating wealth. Contrary to past gaming models, where users must either purchase the game outright, view advertisements, or purchase items to gain a competitive advantage, MOBAs require no payment of any kind. These are free-to-play computer games that provide users with all the tools necessary to compete with anyone free of charge; no competitive advantages can be purchased. This leaves optional purchases of purely aesthetic items as the only way for users to provide money to the company, bought only if the buyer wishes to see their character in a different set of attire. The genre's leading title, League of Legends (LOL), has spearheaded this method of revenue generation. Fortunately for LOL, its popularity has reached levels never before seen in video games: the world championships had more viewers than game 7 of the NBA Finals (Dorsey). The player base alone is enough to keep the company afloat currently, but the fact that only 3.75% of players are converted into revenue is alarming. Each player brings the company an average of $1.32, or 30% of what some other free-to-play games earn per user (Comparing MMO). It is this low per-player income that has caused Riot Games, the developer of LOL, to state that their e-sports division is not currently profitable. To resolve this issue, LOL must take on a more aggressive marketing plan. Advertisements for the NBA Finals cost $460,000 for 30 seconds, and LOL should aim for ads in this range (Lombardo).
With an average of 3 million people logged on at any time, 90% of the players being male and 85% being between the ages of 16 and 30, advertising via this game would appeal to many companies, making a deal easy to strike (LOL infographic 2012). The idea also appeals to players: 81% of players surveyed said that an advertisement on the client that allows for the option to place an order would improve or not impact their experience. Moving forward with this, the gaming client would be updated to contain both an option to order pizza and an advertisement for Mountain Dew. This type of advertising was determined based on community responses through a sequence of survey questions. These small adjustments to the game would allow LOL to generate enough income for Riot Games to expand into other areas of the e-sports industry.
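The figures cited above imply how much each paying player actually contributes: average revenue per user divided by the conversion rate. A quick check of that arithmetic (variable names are ours; the figures come from the text):

```python
arpu = 1.32          # average revenue per user across all players ($)
conversion = 0.0375  # fraction of players who ever spend money (3.75%)

# Average revenue contributed by each *paying* player:
revenue_per_paying_user = arpu / conversion
print(round(revenue_per_paying_user, 2))  # 35.2
```

In other words, the small paying minority spends about $35 each on average, which is why raising the conversion rate (or adding advertising revenue from the non-paying majority, as proposed) matters more than squeezing existing payers.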
Contributors: Seip, Patrick (Co-author) / Zhao, BoNing (Co-author) / Kashiwagi, Dean (Thesis director) / Kashiwagi, Jacob (Committee member) / Barrett, The Honors College (Contributor) / Sandra Day O'Connor College of Law (Contributor) / Department of Economics (Contributor) / Department of Supply Chain Management (Contributor)
Created: 2015-05
Description
While not officially recognized as an addictive activity by the Diagnostic and Statistical Manual of Mental Disorders, video game addiction has well-documented resources pointing to its effects on the physiological and mental health of both the addict and those close to the addict. With the rise of eSports, treating video game addiction has become trickier as a passionate and growing fan base begins to act as a culture not unlike that of traditional sporting. These concerns call for a better understanding of what constitutes a harmful addiction to video games as heavy play becomes more financially viable and accepted into mainstream culture.
Contributors: Gohil, Abhishek Bhagirathsinh (Author) / Kashiwagi, Dean (Thesis director) / Kashiwagi, Jacob (Committee member) / Barrett, The Honors College (Contributor) / Computer Science and Engineering Program (Contributor)
Created: 2015-05
Description
Ethanol is a widely used biofuel in the United States that is typically produced through the fermentation of biomass feedstocks. Demand for ethanol has grown significantly from 2000 to 2015 chiefly due to a desire to increase energy independence and reduce the emissions of greenhouse gases associated with transportation. As demand grows, new ethanol plants must be developed in order for supply to meet demand. This report covers some of the major considerations in developing these new plants such as the type of biomass used, feed treatment process, and product separation and investigates their effect on the economic viability and environmental benefits of the ethanol produced. The dry grind process for producing ethanol from corn, the most common method of production, is examined in greater detail. Analysis indicates that this process currently has the highest capacity for production and profitability but limited effect on greenhouse gas emissions compared to less common alternatives.
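For context on the fermentation route discussed above, the theoretical mass yield of ethanol from glucose follows from the stoichiometry C6H12O6 -> 2 C2H5OH + 2 CO2. This is a textbook figure, not a number from the report; the variable names are ours.

```python
M_GLUCOSE = 180.16  # molar mass of glucose, g/mol
M_ETHANOL = 46.07   # molar mass of ethanol, g/mol

# Two moles of ethanol per mole of glucose fermented:
theoretical_yield = 2 * M_ETHANOL / M_GLUCOSE
print(round(theoretical_yield, 3))  # 0.511 g ethanol per g glucose
```

Real dry-grind plants fall short of this ~0.511 g/g ceiling because some substrate goes to cell growth and byproducts, which is one reason process choices like feed treatment and separation matter for the economics examined in the report.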
Contributors: Schrilla, John Paul (Author) / Kashiwagi, Dean (Thesis director) / Kashiwagi, Jacob (Committee member) / Barrett, The Honors College (Contributor) / Chemical Engineering Program (Contributor)
Created: 2015-05