Matching Items (436)
Description
Under the framework of intelligent management of power grids by leveraging advanced information, communication, and control technologies, a primary objective of this study is to develop novel data mining and data processing schemes for several critical applications that can enhance the reliability of power systems. Specifically, this study is broadly organized into the following two parts: I) spatio-temporal wind power analysis for wind generation forecast and integration, and II) data mining and information fusion of synchrophasor measurements toward secure power grids.

Part I is centered around wind power generation forecast and integration. First, a spatio-temporal analysis approach for short-term wind farm generation forecasting is proposed. Specifically, using extensive measurement data from an actual wind farm, the probability distribution and the level crossing rate of wind farm generation are characterized using tools from graphical learning and time-series analysis. Built on these spatial and temporal characterizations, finite-state Markov chain models are developed, and a point forecast of wind farm generation is derived using the Markov chains. Then, multi-timescale scheduling and dispatch with stochastic wind generation and opportunistic demand response is investigated.

Part II focuses on incorporating the emerging synchrophasor technology into the security assessment and the post-disturbance fault diagnosis of power systems. First, a data-mining framework is developed for on-line dynamic security assessment (DSA) by using adaptive ensemble decision tree learning of real-time synchrophasor measurements. Under this framework, novel on-line DSA schemes are devised, aiming to handle various factors (including variations of operating conditions, forced system topology change, and loss of critical synchrophasor measurements) that can have a significant impact on the performance of conventional data-mining-based on-line DSA schemes.
Then, in the context of post-disturbance analysis, fault detection and localization of line outages is investigated using a dependency graph approach. It is shown that a dependency graph for voltage phase angles can be built according to the interconnection structure of the power system, and that line outage events can be detected and localized through networked data fusion of the synchrophasor measurements collected from multiple locations of the power grid. Along a more practical avenue, a decentralized networked data fusion scheme is proposed for efficient fault detection and localization.
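As an illustration of the finite-state Markov chain forecast described in Part I, a minimal sketch might discretize the generation series into states, estimate a transition matrix by counting, and take the conditional expectation as the point forecast. The equal-width discretization and state count below are hypothetical choices, not the study's actual model:

```python
import numpy as np

def fit_markov_chain(series, n_states):
    """Discretize a generation series into equal-width states and
    estimate the transition matrix by counting observed transitions."""
    edges = np.linspace(series.min(), series.max(), n_states + 1)
    states = np.clip(np.digitize(series, edges) - 1, 0, n_states - 1)
    counts = np.zeros((n_states, n_states))
    for a, b in zip(states[:-1], states[1:]):
        counts[a, b] += 1
    row_sums = counts.sum(axis=1, keepdims=True)
    # Rows for states never visited fall back to a uniform distribution.
    P = np.divide(counts, row_sums,
                  out=np.full_like(counts, 1.0 / n_states),
                  where=row_sums > 0)
    centers = (edges[:-1] + edges[1:]) / 2
    return P, centers, states

def point_forecast(P, centers, current_state):
    """One-step point forecast: expected generation under the
    next-state distribution given the current state."""
    return float(P[current_state] @ centers)
```

A real model would also fold in the spatial characterizations across turbines; this sketch captures only the temporal Markov structure.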
ContributorsHe, Miao (Author) / Zhang, Junshan (Thesis advisor) / Vittal, Vijay (Thesis advisor) / Hedman, Kory (Committee member) / Si, Jennie (Committee member) / Ye, Jieping (Committee member) / Arizona State University (Publisher)
Created2013
Description
Given the importance of buildings as major consumers of resources worldwide, several organizations are working avidly to ensure the negative impacts of buildings are minimized. The U.S. Green Building Council's (USGBC) Leadership in Energy and Environmental Design (LEED) rating system is one such effort to recognize buildings that are designed to achieve a superior performance in several areas including energy consumption and indoor environmental quality (IEQ). The primary objectives of this study are to investigate the performance of LEED certified facilities in terms of energy consumption and occupant satisfaction with IEQ, and introduce a framework to assess the performance of LEED certified buildings.

This thesis attempts to achieve the research objectives by examining the LEED certified buildings on the Arizona State University (ASU) campus in Tempe, AZ, from two complementary perspectives: the Macro-level and the Micro-level. Heating, cooling, and electricity data were collected from the LEED-certified buildings on campus, and their energy use intensity was calculated in order to investigate the buildings' actual energy performance. Additionally, IEQ occupant satisfaction surveys were used to investigate users' satisfaction with the space layout, space furniture, thermal comfort, indoor air quality, lighting level, acoustic quality, water efficiency, cleanliness and maintenance of the facilities they occupy.

From a Macro-level perspective, the results suggest that ASU LEED buildings consume less energy than regional counterparts and exhibit higher occupant satisfaction than national counterparts. The occupant satisfaction results are in line with the literature on LEED buildings, whereas the energy results contribute to the inconclusive body of knowledge on energy performance improvements linked to LEED certification. From a Micro-level perspective, data analysis suggests an inconsistency between the LEED points earned for the Energy & Atmosphere and IEQ categories, on the one hand, and the respective levels of energy consumption and occupant satisfaction on the other. Accordingly, this study showcases the variation in the performance results when approached from different perspectives. This contribution highlights the need to consider the Macro-level and Micro-level assessments in tandem, and to assess LEED building performance from these two distinct but complementary perspectives in order to develop a more comprehensive understanding of actual building performance.
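The energy use intensity (EUI) comparison described above comes down to normalizing total annual site energy by gross floor area; a minimal sketch with made-up figures, not actual ASU meter data:

```python
def site_eui(electricity_kbtu, heating_kbtu, cooling_kbtu, floor_area_sqft):
    """Site energy use intensity: total annual site energy per gross
    square foot (kBtu/ft^2/yr), the metric used to compare buildings."""
    if floor_area_sqft <= 0:
        raise ValueError("floor area must be positive")
    return (electricity_kbtu + heating_kbtu + cooling_kbtu) / floor_area_sqft

# Illustrative: a 20,000 ft^2 building using 500,000 kBtu of electricity,
# 300,000 of heating, and 200,000 of cooling has an EUI of 50 kBtu/ft^2/yr,
# which could then be set against a regional benchmark.
```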
ContributorsChokor, Abbas (Author) / El Asmar, Mounir (Thesis advisor) / Chong, Oswald (Committee member) / Parrish, Kristen (Committee member) / Arizona State University (Publisher)
Created2015
Description
Resilience is emerging as the preferred way to improve the protection of infrastructure systems beyond established risk management practices. Massive damage experienced during tragedies like Hurricane Katrina showed that risk analysis is incapable of preventing unforeseen infrastructure failures and shifted expert focus toward resilience to absorb and recover from adverse events. Recent, exponential growth in research is now producing consensus on how to think about infrastructure resilience, centered on definitions and models from influential organizations like the US National Academy of Sciences. Despite widespread efforts, massive infrastructure failures in 2017 demonstrate that resilience is still not working, raising the question: are the ways people think about resilience producing resilient infrastructure systems?



This dissertation argues that established thinking harbors misconceptions about infrastructure systems that diminish attempts to improve their resilience. Widespread efforts based on the current canon focus on improving data analytics, establishing resilience goals, reducing failure probabilities, and measuring cascading losses. Unfortunately, none of these pursuits changes the resilience of an infrastructure system, because none of them results in knowledge about how data is used, goals are set, or failures occur. Through the examination of each misconception, this dissertation develops practical new approaches for infrastructure systems to respond to unforeseen failures via sensing, adapting, and anticipating processes. Specifically, infrastructure resilience is improved by sensing when data analytics include the modeler-in-the-loop, adapting to stress contexts by switching between multiple resilience strategies, and anticipating crisis coordination activities prior to experiencing a failure.

Overall, the results demonstrate that current resilience thinking needs to change because it does not differentiate resilience from risk. Most research treats resilience as a property that a system has, like a noun, when resilience is really an action a system does, like a verb. Treating resilience as a noun only strengthens commitment to risk-based practices that do not protect infrastructure from unknown events. Instead, thinking about resilience as a verb overcomes prevalent misconceptions about data, goals, systems, and failures, and may bring a necessary, radical change to the way infrastructure is protected in the future.
ContributorsEisenberg, Daniel Alexander (Author) / Seager, Thomas P. (Thesis advisor) / Park, Jeryang (Thesis advisor) / Alderson, David L. (Committee member) / Lai, Ying-Cheng (Committee member) / Arizona State University (Publisher)
Created2018
Description
Our daily life is becoming more and more reliant on services provided by infrastructures such as the power, gas, and communication networks. Ensuring the security of these infrastructures is of utmost importance. This task becomes ever more challenging as the inter-dependence among these infrastructures grows and a security breach in one infrastructure can spill over to the others. The implication is that the security practices/analysis recommended for these infrastructures should be done in coordination. This thesis, focusing on the power grid, explores strategies to secure the system that look into the coupling of the power grid to the cyber infrastructure, used to manage and control it, and to the gas grid, which supplies an increasing amount of reserves to overcome contingencies.

The first part of the thesis (Part I), including chapters 2 through 4, focuses on the coupling of the power and cyber infrastructures used for the grid's control and operations. The goal is to detect malicious attacks that gain information about the operation of the power grid in order to later attack the system. In chapter 2, we propose a hierarchical architecture that correlates the analysis of high-resolution micro-Phasor Measurement Unit (microPMU) data with traffic analysis of Supervisory Control and Data Acquisition (SCADA) packets to infer the security status of the grid and detect the presence of possible intruders. An essential part of this architecture is tied to the analysis of the microPMU data. In chapter 3, we establish a set of anomaly detection rules on microPMU data that flag "abnormal behavior". A placement strategy for microPMU sensors is also proposed to maximize the sensitivity in detecting anomalies. In chapter 4, we focus on developing rules that can localize the source of an event using microPMU data, to further check whether a cyber attack is causing the anomaly by correlating SCADA traffic with the microPMU data analysis results. The thread that unifies the data analysis in this chapter is the fact that decisions are made without fully estimating the state of the system; on the contrary, decisions are made using a set of physical measurements that falls short, by orders of magnitude, of meeting the needs for observability. More specifically, in the first part of this chapter (sections 4.1-4.2), using microPMU data in the substation, methodologies for online identification of the source Thevenin parameters are presented. This methodology is used to identify reconnaissance activity on the normally-open switches in the substation, initiated by attackers to gauge their controllability over the cyber network. The application of this methodology to monitoring the voltage stability of the grid is also discussed. In the second part of this chapter (sections 4.3-4.5), we investigate the localization of faults. Since the number of PMU sensors available to carry out the inference is insufficient to ensure observability, the problem can be viewed as that of under-sampling a "graph signal"; the analysis leads to a PMU placement strategy that achieves the highest resolution in localizing the fault for a given number of sensors. In both cases, the results of the analysis are leveraged in the detection of cyber-physical attacks, where microPMU data and relevant SCADA network traffic information are compared to determine whether a network breach has affected the integrity of the system information and/or operations.

In the second part of this thesis (Part II), the security analysis considers the adequacy and reliability of schedules for the gas and power networks. Jointly scheduling supply in gas and power networks is motivated by the increasing reliance of power grids on natural gas generators (and, indirectly, on gas pipelines) as providers of critical reserves. Chapter 5 focuses on unveiling the challenges of this problem and providing solutions to it.
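The online Thevenin parameter identification mentioned for sections 4.1-4.2 can be sketched as a least-squares fit, assuming the simple source model V = E - Z*I and hypothetical phasor pairs (not actual microPMU data):

```python
import numpy as np

def estimate_thevenin(V, I):
    """Least-squares estimate of the Thevenin source voltage E and
    impedance Z from complex voltage/current phasor pairs,
    assuming the source model V = E - Z * I."""
    A = np.column_stack([np.ones_like(I), -I])
    x, *_ = np.linalg.lstsq(A, V, rcond=None)
    return x[0], x[1]  # E, Z
```

With noisy measurements, a sliding window of phasor pairs would be fit repeatedly, and a sudden shift in the estimated Z could flag a topology change such as a switch operation.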
ContributorsJamei, Mahdi (Author) / Scaglione, Anna (Thesis advisor) / Ayyanar, Raja (Committee member) / Hedman, Kory W (Committee member) / Kosut, Oliver (Committee member) / Arizona State University (Publisher)
Created2018
Description
The analysis of clinical workflow offers many challenges to clinical stakeholders and researchers, especially in environments characterized by dynamic and concurrent processes. Workflow analysis in such environments is essential for monitoring performance and finding bottlenecks and sources of error. Clinical workflow analysis has been enhanced with the inclusion of modern technologies. One such intervention is automated location tracking which is a system that detects the movement of clinicians and equipment. Utilizing the data produced from automated location tracking technologies can lead to the development of novel workflow analytics that can be used to complement more traditional approaches such as ethnography and grounded-theory based qualitative methods. The goals of this research are to: (i) develop a series of analytic techniques to derive deeper workflow-related insight in an emergency department setting, (ii) overlay data from disparate sources (quantitative and qualitative) to develop strategies that facilitate workflow redesign, and (iii) incorporate visual analytics methods to improve the targeted visual feedback received by providers based on the findings. The overarching purpose is to create a framework to demonstrate the utility of automated location tracking data used in conjunction with clinical data like EHR logs and its vital role in the future of clinical workflow analysis/analytics. This document is categorized based on two primary aims of the research. The first aim deals with the use of automated location tracking data to develop a novel methodological/exploratory framework for clinical workflow. The second aim is to overlay the quantitative data generated from the previous aim on data from qualitative observation and shadowing studies (mixed methods) to develop a deeper view of clinical workflow that can be used to facilitate workflow redesign. 
The final sections of the document speculate on the direction of this work, discussing the potential of this research in the creation of fully integrated clinical environments, i.e., environments with state-of-the-art location tracking and other data collection mechanisms. The main purpose of this research is to demonstrate ways by which clinical processes can be continuously monitored, allowing for proactive adaptations in the face of technological and process changes to minimize any negative impact on the quality of patient care and provider satisfaction.
ContributorsVankipuram, Akshay (Author) / Patel, Vimla L. (Thesis advisor) / Wang, Dongwen (Thesis advisor) / Shortliffe, Edward H (Committee member) / Kaufman, David R. (Committee member) / Traub, Stephen J (Committee member) / Arizona State University (Publisher)
Created2018
Description
For our collaborative thesis, we explored the US electric utility market and how the Internet of Things (IoT) technology movement could capture a possible advancement of the existing grid. Our objective was to understand the market trends in the utility space and identify where a semiconductor manufacturing company with a focus on IoT technology could penetrate the market using its products. Our research methodology was to conduct industry interviews to identify common trends in the utility and industrial hardware manufacturing industries. From there, we composed various strategies that The Company should explore. These strategies were supported by qualitative reasoning and by forecasted discounted cash flow and net present value analyses. We confirmed that The Company should use specific silicon microprocessors and microcontrollers suited to the analytics demand of each of the four devices. Along with a silicon strategy, our group believes there is a strong argument for offering a data analytics software package by forming strategic partnerships in this space.
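The net present value analysis mentioned above follows the standard discounting formula NPV = sum over t of CF_t / (1 + r)^t; a minimal sketch with hypothetical cash flows, not the thesis's actual forecasts:

```python
def npv(rate, cash_flows):
    """Net present value of a cash-flow stream; cash_flows[0] occurs at
    t = 0 (typically the initial outlay, entered as a negative number)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# A hypothetical product line: $2M outlay, then $800k per year for four
# years, discounted at 12%. A positive NPV argues for pursuing the strategy.
value = npv(0.12, [-2_000_000, 800_000, 800_000, 800_000, 800_000])
```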
ContributorsLlazani, Loris (Co-author) / Ruland, Matthew (Co-author) / Medl, Jordan (Co-author) / Crowe, David (Co-author) / Simonson, Mark (Thesis director) / Hertzel, Mike (Committee member) / Department of Economics (Contributor) / Department of Finance (Contributor) / Department of Supply Chain Management (Contributor) / Department of Information Systems (Contributor) / Hugh Downs School of Human Communication (Contributor) / Barrett, The Honors College (Contributor)
Created2016-05
Description
As the use of Big Data gains momentum and transitions into mainstream adoption, marketers are racing to generate valuable insights that can create well-informed strategic business decisions. The retail market is a fiercely competitive industry, and the rapid adoption of smartphones and tablets has led e-commerce rivals to grow at an unbelievable rate. Retailers are able to collect and analyze data from both their physical stores and e-commerce platforms, placing them in a unique position to fully capitalize on the power of Big Data. This thesis is an examination of Big Data and how marketers can use it to create better experiences for consumers. Insights generated from the use of Big Data can result in increased customer engagement, loyalty, and retention for an organization. Businesses of all sizes, whether enterprise, small-to-midsize, or solely e-commerce organizations, have successfully implemented Big Data technology. However, there are challenges, as well as ethical and legal concerns, that need to be addressed as the world continues to adopt Big Data analytics and insights. With the abundance of data collected in today's digital world, marketers must take advantage of available resources to improve the overall customer experience.
ContributorsHaghgoo, Sam (Author) / Ostrom, Amy (Thesis director) / Giles, Bret (Committee member) / Barrett, The Honors College (Contributor) / Department of Marketing (Contributor) / W. P. Carey School of Business (Contributor) / Department of Management (Contributor)
Created2014-05
Description
Predictive analytics have been used in a wide variety of settings, including healthcare, sports, banking, and other disciplines. We use predictive analytics and modeling to determine the impact of certain factors that increase the probability of a successful fourth down conversion in the Power 5 conferences. The logistic regression models predict the likelihood of going for fourth down with a 64% or more probability based on 2015-17 data obtained from ESPN's college football API. Offense type, though important, is not directly measurable and was incorporated as a random effect. We found that distance to go, play type, field position, and week of the season were key leading covariates in predictability. On average, our model performed as much as 14% better than coaches in 2018.
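A minimal sketch of the kind of logistic model described above, trained here on synthetic data (not the ESPN dataset, and without the random effect for offense type):

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, steps=3000):
    """Batch gradient descent for logistic regression with an intercept."""
    Xb = np.column_stack([np.ones(len(X)), X])
    w = np.zeros(Xb.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))
        w -= lr * Xb.T @ (p - y) / len(y)
    return w

def conversion_prob(w, features):
    """Predicted probability of converting a fourth-down attempt."""
    z = w[0] + np.dot(w[1:], features)
    return 1.0 / (1.0 + np.exp(-z))

# Synthetic training data: conversion odds fall as distance to go grows.
rng = np.random.default_rng(42)
n = 400
distance = rng.integers(1, 11, n).astype(float)   # yards to go
field_pos = rng.uniform(0.2, 1.0, n)              # normalized field position
p_true = 1.0 / (1.0 + np.exp(-(1.5 - 0.35 * distance)))
converted = (rng.random(n) < p_true).astype(float)

w = fit_logistic(np.column_stack([distance, field_pos]), converted)
```

The fitted model should recover the negative effect of distance, so a 4th-and-2 attempt gets a higher predicted conversion probability than a 4th-and-10 from the same spot.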
ContributorsBlinkoff, Joshua Ian (Co-author) / Voeller, Michael (Co-author) / Wilson, Jeffrey (Thesis director) / Graham, Scottie (Committee member) / Dean, W.P. Carey School of Business (Contributor) / Department of Information Systems (Contributor) / Department of Management and Entrepreneurship (Contributor) / Barrett, The Honors College (Contributor)
Created2019-05
Description
Predictive analytics have been used in a wide variety of settings, including healthcare, sports, banking, and other disciplines. We use predictive analytics and modeling to determine the impact of certain factors that increase the probability of a successful fourth down conversion in the Power 5 conferences. The logistic regression models predict the likelihood of going for fourth down with a 64% or more probability based on 2015-17 data obtained from ESPN's college football API. Offense type, though important, is not directly measurable and was incorporated as a random effect. We found that distance to go, play type, field position, and week of the season were key leading covariates in predictability. On average, our model performed as much as 14% better than coaches in 2018.
ContributorsVoeller, Michael Jeffrey (Co-author) / Blinkoff, Josh (Co-author) / Wilson, Jeffrey (Thesis director) / Graham, Scottie (Committee member) / Department of Information Systems (Contributor) / Department of Finance (Contributor) / Barrett, The Honors College (Contributor)
Created2019-05
Description
The objective of this project was the creation of a web app for undergraduate CIS/BDA students which allows them to search for jobs based on criteria that are not always directly available in the average job search engine, including technical skills, soft skills, location, and industry. This creates a more focused way for these students to search for jobs, using an application that also attempts to exclude positions looking for very experienced employees. The activities used for this project were chosen in an attempt to make as many of the processes automatable as possible.
This was achieved by first using Offline Explorer, an application that can download websites, to gather job postings from Dice.com searched against a pre-defined list of technical skills. Next came the parsing of the downloaded postings to extract and clean the required data, and the filling of a database with that cleaned data. The companies were then matched with their corresponding industries using their NAICS (North American Industry Classification System) codes. The descriptions were then analyzed, and a group of soft skills was chosen based on the results of Word2Vec (a family of models that produces word embeddings). A master table was then created by combining all of the tables in the database, and it was filtered to exclude posts that required too much experience. Lastly, the web app was created using Node.js as the back-end. This web app allows the user to choose their desired criteria and navigate through the postings that meet them.
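The experience-based filtering step could be sketched as a simple regex pass over posting descriptions; the rule, cutoff, and field names here are hypothetical illustrations, not the project's actual code:

```python
import re

# Matches phrases like "3 years", "8+ yrs" in a posting description.
EXPERIENCE_RE = re.compile(r"(\d+)\+?\s*(?:years?|yrs?)", re.IGNORECASE)

def max_required_years(description):
    """Largest 'N years' figure mentioned in a posting, or 0 if none."""
    hits = [int(m.group(1)) for m in EXPERIENCE_RE.finditer(description)]
    return max(hits, default=0)

def entry_level(postings, cutoff=2):
    """Keep postings asking for at most `cutoff` years of experience."""
    return [p for p in postings if max_required_years(p["description"]) <= cutoff]
```

A production filter would need to distinguish required from preferred experience and handle spelled-out numbers, but the same idea applies.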
ContributorsHenry, Alfred (Author) / Darcy, David (Thesis director) / Moser, Kathleen (Committee member) / Department of Information Systems (Contributor) / Barrett, The Honors College (Contributor)
Created2019-05