Matching Items (6)
Description
Vehicle type choice is a significant determinant of fuel consumption and energy sustainability; larger, heavier vehicles consume more fuel and expel twice as many pollutants as their smaller, lighter counterparts. Over the past few decades, vehicle type choice has shifted markedly, with many households making more trips in larger vehicles with lower fuel economy. During the 1990s, SUVs were the fastest-growing segment of the automotive industry, comprising 7% of the total light vehicle market in 1990 and 25% in 2005. More recently, due to rising oil prices, greater environmental awareness, the desire to reduce dependence on foreign oil, and the availability of new vehicle technologies, many households are considering newer vehicles with better fuel economy, such as hybrids and electric vehicles, over the SUVs or low-fuel-economy vehicles they may already own. The goal of this research is to examine how vehicle miles traveled, fuel consumption, and emissions may be reduced through shifts in vehicle type choice behavior. Using the 2009 National Household Travel Survey data, a model can be developed to estimate household travel demand and total fuel consumption. Given a vehicle type choice shift scenario, the model can then be used to calculate the potential fuel consumption savings that would result from such a shift. In this way, fuel consumption reductions can be estimated under a wide variety of scenarios.
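The scenario calculation the abstract describes can be sketched in a few lines. This is a minimal illustrative example, not the author's actual model: the vehicle classes, mileage figures, and fuel-economy values below are hypothetical assumptions chosen only to show the arithmetic of a vehicle-type shift.

```python
# Hypothetical sketch of a vehicle-type-shift fuel-savings calculation.
# All numbers below are illustrative, not from the NHTS data.

def annual_fuel_gallons(vmt_by_class, mpg_by_class):
    """Total annual fuel use (gallons) given miles traveled and fuel economy per class."""
    return sum(vmt_by_class[c] / mpg_by_class[c] for c in vmt_by_class)

# Baseline: annual miles driven per vehicle class and average miles per gallon.
baseline_vmt = {"suv": 8000.0, "sedan": 6000.0, "hybrid": 1000.0}
mpg = {"suv": 18.0, "sedan": 28.0, "hybrid": 45.0}

# Shift scenario: half of all SUV miles move to hybrids.
scenario_vmt = dict(baseline_vmt)
shifted = scenario_vmt["suv"] * 0.5
scenario_vmt["suv"] -= shifted
scenario_vmt["hybrid"] += shifted

savings = annual_fuel_gallons(baseline_vmt, mpg) - annual_fuel_gallons(scenario_vmt, mpg)
print(f"Fuel saved: {savings:.1f} gallons/year")
```

Repeating the same subtraction across many candidate shift scenarios is what allows savings to be compared under a wide variety of assumptions.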
Contributors: Christian, Keith (Author) / Pendyala, Ram M. (Thesis advisor) / Chester, Mikhail (Committee member) / Kaloush, Kamil (Committee member) / Ahn, Soyoung (Committee member) / Arizona State University (Publisher)
Created: 2013
Description
Life Cycle Assessment (LCA) quantifies the environmental impacts of products across raw material extraction, processing, manufacturing, distribution, use, and final disposal. The findings of an LCA can be used to improve industry practices, to aid in product development, and to guide public policy. Unfortunately, existing approaches to LCA are unreliable for emerging technologies, where data are unavailable and rapid technological advances outstrip environmental knowledge. Previous studies have demonstrated several shortcomings of existing practices, including the masking of environmental impacts, the difficulty of selecting appropriate weight sets for multi-stakeholder problems, and difficulties in exploring variability and uncertainty. In particular, there is an acute need for decision-driven interpretation methods that can guide decision makers towards balanced, environmentally sound decisions in instances of high uncertainty. We propose the first major methodological innovation in LCA since its early establishment as the analytical perspective of choice in problems of environmental management. We propose to couple stochastic multi-criteria decision analytic tools with existing approaches to inventory building and characterization to create a robust approach to comparative technology assessment in the context of high uncertainty, rapid technological change, and evolving stakeholder values. Namely, this study introduces a novel method known as Stochastic Multi-attribute Analysis for Life Cycle Impact Assessment (SMAA-LCIA) that uses internal normalization by means of outranking and exploration of feasible weight spaces.
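The stochastic multi-criteria idea behind SMAA-style methods can be illustrated with a small sketch. This is a simplified toy version, not the dissertation's implementation: it normalizes each impact category internally across alternatives, samples criterion weights uniformly from the feasible weight simplex, and tallies how often each alternative ranks first (a rank-acceptability index). The alternative names and impact scores are hypothetical.

```python
# Simplified sketch of stochastic multi-attribute analysis:
# sample weight vectors, rank alternatives, tally first-place frequencies.
import math
import random

def rank_acceptability(scores, n_samples=10000, seed=0):
    """scores[alt] = per-criterion impact values (lower is better).
    Returns the fraction of sampled weight vectors under which each
    alternative ranks first."""
    rng = random.Random(seed)
    alts = list(scores)
    n_criteria = len(next(iter(scores.values())))
    # Internal normalization: rescale each criterion across alternatives to [0, 1].
    normalized = {a: [] for a in alts}
    for j in range(n_criteria):
        col = [scores[a][j] for a in alts]
        lo, hi = min(col), max(col)
        for a in alts:
            normalized[a].append(0.0 if hi == lo else (scores[a][j] - lo) / (hi - lo))
    wins = {a: 0 for a in alts}
    for _ in range(n_samples):
        # Uniform sample from the weight simplex (normalized exponentials).
        raw = [-math.log(1.0 - rng.random()) for _ in range(n_criteria)]
        total = sum(raw)
        w = [x / total for x in raw]
        best = min(alts, key=lambda a: sum(wi * si for wi, si in zip(w, normalized[a])))
        wins[best] += 1
    return {a: wins[a] / n_samples for a in alts}

# Two hypothetical technologies scored on two impact categories.
impacts = {"tech_A": [1.0, 5.0], "tech_B": [4.0, 1.0]}
print(rank_acceptability(impacts))
```

Because no single weight set is imposed, the result expresses how robust each alternative's standing is across the whole space of stakeholder value profiles.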
Contributors: Prado, Valentina (Author) / Seager, Thomas P. (Thesis advisor) / Landis, Amy E. (Committee member) / Chester, Mikhail (Committee member) / White, Philip (Committee member) / Arizona State University (Publisher)
Created: 2013
Description
Infrastructure is increasingly recognized as too rigid to adapt quickly to a changing climate and a non-stationary future. This rigidity poses risks to infrastructure service delivery and public welfare. Adaptivity in infrastructure is critical for managing uncertainties and continuing to provide services, yet little is known about how infrastructure can be made more agile and flexible to improve adaptive capacity. A literature review identified approximately fifty examples of novel infrastructure and technologies that support adaptivity through one or more of ten theoretical competencies of adaptive infrastructure. From these examples emerged several infrastructure forms and possible strategies for adaptivity, including smart technologies, combined centralized/decentralized organizational structures, and renewable electricity generation. With institutional and cultural support, such novel structures and systems have the potential to transform infrastructure provision and management.
Contributors: Gilrein, Erica (Author) / Chester, Mikhail (Thesis advisor) / Garcia, Margaret (Committee member) / Allenby, Braden (Committee member) / Arizona State University (Publisher)
Created: 2018
Description
Motivated by the need for cities to prepare for and be resilient to unpredictable future weather conditions, this dissertation advances a novel infrastructure development theory of “safe-to-fail” to increase the adaptive capacity of cities to climate change. Current infrastructure development relies primarily on identifying probable risks to engineered systems and making infrastructure reliable so that it maintains its function up to a designed system capacity. However, alterations in the earth system (e.g., atmosphere, oceans, land, and ice) and in human systems (e.g., greenhouse gas emissions, population, land use, technology, and natural resource use) are increasing the uncertainty in weather predictions and risk calculations, making it difficult for engineered infrastructure to maintain intended design thresholds in a non-stationary future. This dissertation presents a new way to develop safe-to-fail infrastructure that departs from the current practice of risk calculation and is able to manage failure consequences when unpredicted risks overwhelm engineered systems.

This dissertation 1) defines infrastructure failure, refines existing safe-to-fail theory, and compares decision considerations for safe-to-fail vs. fail-safe infrastructure development under non-stationary climate; 2) suggests an approach to integrate the estimation of infrastructure failure impacts with extreme weather risks; 3) provides a decision tool to implement resilience strategies into safe-to-fail infrastructure development; and, 4) recognizes diverse perspectives for adopting safe-to-fail theory into practice in various decision contexts.

Overall, this dissertation advances safe-to-fail theory to help guide climate adaptation decisions that consider infrastructure failures and their consequences. The results of this dissertation demonstrate an emerging need for stakeholders, including policy makers, planners, engineers, and community members, to understand an impending “infrastructure trolley problem”, in which the adaptive capacity of some regions is improved at the expense of others. Safe-to-fail further engages stakeholders to bring their knowledge into the prioritization of various failure costs based on their institutional, regional, financial, and social capacity to withstand failures. This approach connects to sustainability, in which city practitioners deliberately consider and include the future social, environmental, and economic costs in planning and decision-making.

Contributors: Kim, Yeowon (Author) / Chester, Mikhail (Thesis advisor) / Eakin, Hallie (Committee member) / Redman, Charles (Committee member) / Miller, Thaddeus R. (Committee member) / Arizona State University (Publisher)
Created: 2018
Description
Quantum resilience is a pragmatic theory that allows systems engineers to formally characterize the resilience of systems. As a generalized theory, it not only clarifies resilience in the literature, but also can be applied to all disciplines and domains of discourse. Operationalizing resilience in this manner permits decision-makers to compare and contrast system deployment options for suitability in a variety of environments and allows for consistent treatment of resilience across domains. Systems engineers, whether planning future infrastructures or managing ecosystems, are increasingly asked to deliver resilient systems. Quantum resilience provides a way forward that allows specific resilience requirements to be specified, validated, and verified.

Quantum resilience makes two very important claims. First, resilience cannot be characterized without recognizing both the system and the valued function it provides. Second, resilience is not about disturbances, insults, threats, or perturbations; to avoid crippling infinities, resilience must be characterizable without specific disturbances in mind. In light of this, quantum resilience defines resilience as the extent to which a system delivers its valued functions, and characterizes resilience as a function of system productivity and complexity. System productivity vis-à-vis specified “valued functions” involves (1) the quanta of the valued function delivered and (2) the number of systems (within the greater system) that deliver it. System complexity is defined structurally and relationally and is a function of several factors, including (1) system-of-systems hierarchical decomposition, (2) interfaces and connections between systems, and (3) inter-system dependencies.
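The productivity-and-complexity framing above can be made concrete with a toy sketch. This is purely illustrative and is not the dissertation's formulation: the subsystem values, the counting rules, and the productivity-to-complexity ratio used as a score are all hypothetical assumptions chosen to show how such a characterization could be computed without reference to any disturbance.

```python
# Toy characterization of resilience from productivity and complexity.
# The structure of these functions and all numbers are hypothetical.
from dataclasses import dataclass

@dataclass
class Subsystem:
    quanta_delivered: float  # units of the valued function this subsystem provides

def productivity(subsystems):
    """Quanta of the valued function delivered, weighted by the number
    of subsystems that actually deliver it (provider redundancy)."""
    total = sum(s.quanta_delivered for s in subsystems)
    providers = sum(1 for s in subsystems if s.quanta_delivered > 0)
    return total * providers

def complexity(n_levels, n_interfaces, n_dependencies):
    """Structural/relational complexity as a simple additive count of
    hierarchy levels, interfaces, and inter-system dependencies."""
    return n_levels + n_interfaces + n_dependencies

# A hypothetical water system: three subsystems, one currently delivering nothing.
water = [Subsystem(40.0), Subsystem(35.0), Subsystem(0.0)]
score = productivity(water) / complexity(n_levels=3, n_interfaces=5, n_dependencies=4)
print(f"Toy resilience score: {score:.2f}")
```

Note that no threat or perturbation appears anywhere in the calculation, which is the point of the second claim: the characterization depends only on the system and its valued function.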

Among the important features of quantum resilience is that it can be implemented in any system engineering tool that provides sufficient design and specification rigor (i.e., one that supports standards like the Lifecycle and Systems Modeling languages and frameworks like the DoD Architecture Framework). Further, this can be accomplished with minimal software development and has been demonstrated in three model-based system engineering tools, two of which are commercially available, well-respected, and widely used. This pragmatic approach assures transparency and consistency in characterization of resilience in any discipline.
Contributors: Roberts, Thomas Wade (Author) / Allenby, Braden (Thesis advisor) / Chester, Mikhail (Committee member) / Anderies, John M. (Committee member) / Arizona State University (Publisher)
Created: 2015
Description
Effective collection and dissemination of project information, including best practices, help increase the likelihood of strong project performance and are vital to organizations in the architecture-engineering-construction (AEC) industry. Best practices can help improve project performance, yet these practices are not universally implemented and used in the industry, for the following reasons: 1) not all practices are applicable to every project or organization; 2) knowledge is lost through organizational turnover, which leads to inconsistent collection and implementation of best practices; and 3) organizations lack standardized processes for best practice management.

This research, sponsored by the National Academy of Construction, the Construction Industry Institute, and Arizona State University, used structured interviews, a Delphi study, and focus groups to explore: 1) the potential benefit of, and industry interest in, an open repository of best practices, and 2) the important elements of a framework/model that guides the creation, management, and sustainment of an open repository of best practices.

This dissertation presents findings specifically exploring the term "Practices for Excellence": its definition, the elements that hinder implementation, the potential value of an open online repository for such practices, and a model to develop such a repository.
Contributors: Bosfield, Roberta Patrice (Author) / Gibson, Edd (Thesis advisor) / Chester, Mikhail (Committee member) / Parrish, Kristen (Committee member) / Sullivan, Kenneth (Committee member) / Arizona State University (Publisher)
Created: 2014