This collection includes most of the ASU Theses and Dissertations from 2011 to the present. ASU Theses and Dissertations are available in downloadable PDF format; however, a small percentage of items are under embargo. Information about the dissertations/theses includes degree information, committee members, an abstract, and supporting data or media.

In addition to the electronic theses found in the ASU Digital Repository, ASU Theses and Dissertations can be found in the ASU Library Catalog.

Dissertations and Theses granted by Arizona State University are archived and made available through a joint effort of the ASU Graduate College and the ASU Libraries. For more information or questions about this collection, visit the Digital Repository ETD Library Guide or contact the ASU Graduate College at gradformat@asu.edu.

Displaying 1 - 6 of 6

Description
With the rapid development of mobile sensing technologies such as GPS, RFID, and smartphone sensors, capturing position data in the form of trajectories has become easy. Moving object trajectory analysis is a growing area of interest owing to its applications in various domains such as marketing, security, and traffic monitoring and management. To better understand movement behaviors from the raw mobility data, this doctoral work provides analytic models for analyzing trajectory data. As a first contribution, a model is developed to detect changes in trajectories over time. If the taxis moving in a city are viewed as sensors that provide real-time information about traffic in the city, a change in these trajectories over time can reveal that the road network has changed. To detect changes, trajectories are modeled with a Hidden Markov Model (HMM). A modified training algorithm for parameter estimation in HMMs, called m-BaumWelch, is used to develop likelihood estimates under assumed changes, which are then used to detect changes in trajectory data over time. Data from vehicles are used to test the method for change detection. Secondly, sequential pattern mining is used to develop a model to detect changes in frequent patterns occurring in trajectory data. The aim is to answer two questions: Are the frequent patterns still frequent in the new data? If they are frequent, has the time interval distribution in the pattern changed? Two different approaches are considered for change detection: a frequency-based approach and a distribution-based approach. The methods are illustrated with vehicle trajectory data. Finally, a model is developed for clustering and outlier detection in semantic trajectories. A challenge with clustering semantic trajectories is that both numeric and categorical attributes are present. Another problem to be addressed while clustering is that trajectories can be of different lengths and can also have missing values. A tree-based ensemble is used to address these problems. The approach is extended to outlier detection in semantic trajectories.
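As an illustration of the likelihood-based change detection described above, here is a minimal sketch in Python, assuming trajectories are arrays of 2-D positions. It uses hmmlearn's standard Baum-Welch training rather than the dissertation's modified m-BaumWelch algorithm, and the threshold calibration, state count, and function names are illustrative assumptions.

```python
# Hedged sketch: HMM likelihood-based change detection for trajectory data.
import numpy as np
from hmmlearn import hmm

def fit_baseline_hmm(trajectories, n_states=5):
    """Fit an HMM to baseline trajectories (each a (T_i, 2) array of positions)."""
    X = np.vstack(trajectories)
    lengths = [len(t) for t in trajectories]
    model = hmm.GaussianHMM(n_components=n_states, covariance_type="diag", n_iter=50)
    model.fit(X, lengths)
    return model

def detect_change(model, new_trajectories, threshold):
    """Flag a change when the average per-point log-likelihood of new data
    drops below a threshold calibrated on baseline data."""
    X = np.vstack(new_trajectories)
    lengths = [len(t) for t in new_trajectories]
    avg_loglik = model.score(X, lengths) / len(X)
    return avg_loglik < threshold, avg_loglik
```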
Contributors: Kondaveeti, Anirudh (Author) / Runger, George C. (Thesis advisor) / Mirchandani, Pitu (Committee member) / Pan, Rong (Committee member) / Maciejewski, Ross (Committee member) / Arizona State University (Publisher)
Created: 2012
Description
Critical infrastructures in healthcare, power systems, and web services incorporate cyber-physical systems (CPSes), in which software-controlled computing systems interact with the physical environment through actuation and monitoring. Ensuring software safety in CPSes, to avoid hazards to property and human life as a result of uncontrolled interactions, is essential and challenging. The principal hurdle in this regard is the characterization of the context-driven interactions between software and the physical environment (cyber-physical interactions), which introduce multi-dimensional dynamics in space and time, complex non-linearities, and non-trivial aggregation of interactions in the case of networked operations. Traditionally, CPS software is tested for safety either through experimental trials, which can be expensive, incomprehensive, and hazardous, or through static analysis of code, which ignores the cyber-physical interactions. This thesis considers model-based engineering, a paradigm widely used in different disciplines of engineering, for safety verification of CPS software and contributes to three fundamental phases: a) modeling, building abstractions or models that characterize cyber-physical interactions in a mathematical framework; b) analysis, reasoning about safety based on properties of the model; and c) synthesis, implementing models on standard testbeds for performing preliminary experimental trials. In this regard, CPS modeling techniques are proposed that can accurately capture the context-driven spatio-temporal aggregate cyber-physical interactions. Different levels of abstraction are considered, which result in high-level architectural models or more detailed formal behavioral models of CPSes. The outcomes include a well-defined architectural specification framework called CPS-DAS and a novel spatio-temporal formal model called Spatio-Temporal Hybrid Automata (STHA) for CPSes. Model analysis techniques are proposed for the CPS models, which can simulate the effects of dynamic context changes on non-linear spatio-temporal cyber-physical interactions and characterize aggregate effects. The outcomes include tractable algorithms for simulation analysis and for theoretically proving safety properties of CPS software. Lastly, a software synthesis technique is proposed that can automatically convert high-level architectural models of CPSes in the healthcare domain into implementations in high-level programming languages. The outcome is a tool called Health-Dev that can synthesize software implementations of CPS models in healthcare for experimental verification of safety properties.
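For intuition about the behavioral models mentioned above, the following is a minimal, generic hybrid-automaton simulation sketch in Python (modes with continuous flows and guarded switching). It is a textbook-style construct, not the STHA formalism, CPS-DAS, or Health-Dev tools from this work; all names, dynamics, and the thermostat example are illustrative assumptions.

```python
# Hedged sketch: simulating a simple hybrid automaton (modes, flows, guarded transitions).
from dataclasses import dataclass
from typing import Callable, Dict, List, Tuple

@dataclass
class Mode:
    name: str
    flow: Callable[[float], float]                      # dx/dt as a function of state x
    guards: List[Tuple[Callable[[float], bool], str]]   # (condition on x, target mode name)

def simulate(modes: Dict[str, Mode], start: str, x0: float, dt: float, steps: int):
    """Euler-integrate the active mode's flow and switch modes when a guard fires."""
    mode, x, trace = modes[start], x0, []
    for _ in range(steps):
        x += dt * mode.flow(x)
        for cond, target in mode.guards:
            if cond(x):
                mode = modes[target]
                break
        trace.append((mode.name, x))
    return trace

# Illustrative two-mode (thermostat-style) automaton.
modes = {
    "heat": Mode("heat", flow=lambda x: 2.0, guards=[(lambda x: x >= 22.0, "cool")]),
    "cool": Mode("cool", flow=lambda x: -1.0, guards=[(lambda x: x <= 18.0, "heat")]),
}
trace = simulate(modes, "heat", 20.0, dt=0.1, steps=100)
```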
Contributors: Banerjee, Ayan (Author) / Gupta, Sandeep K.S. (Thesis advisor) / Poovendran, Radha (Committee member) / Fainekos, Georgios (Committee member) / Maciejewski, Ross (Committee member) / Arizona State University (Publisher)
Created: 2012
Description
Identifying important variation patterns is a key step toward identifying the root causes of process variability. This gives rise to a number of challenges. First, the variation patterns might be non-linear in the measured variables, while the existing research literature has focused on linear relationships. Second, it is important to remove noise from the dataset in order to visualize the true nature of the underlying patterns. Third, in addition to visualizing the pattern (preimage), it is also essential to understand the relevant features that define the process variation pattern. This dissertation addresses these challenges. A base kernel principal component analysis (KPCA) algorithm transforms the measurements to a high-dimensional feature space where non-linear patterns in the original measurements can be handled through linear methods. However, the principal component subspace in feature space might not be well estimated (especially from noisy training data). An ensemble procedure is constructed in which the final preimage is estimated as the average over bagged samples drawn from the original dataset, to attenuate noise in kernel subspace estimation. This improves the robustness of any base KPCA algorithm. In a second method, successive iterations of denoising a convex combination of the training data and the corresponding denoised preimage are used to produce a more accurate estimate of the actual denoised preimage for noisy training data. The number of primary eigenvectors chosen in each iteration is also decreased at a constant rate. An efficient stopping-rule criterion is used to reduce the number of iterations. A feature selection procedure for KPCA is constructed to find the set of relevant features from noisy training data. Data points are projected onto sparse random vectors. Pairs of such projections are then matched, and the differences in variation patterns within pairs are used to identify the relevant features. This approach provides robustness to irrelevant features by calculating the final variation pattern from an ensemble of feature subsets. Experiments are conducted using several simulated as well as real-life data sets. The proposed methods show significant improvement over competing methods.
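The ensemble preimage idea described above can be sketched roughly as follows in Python, using scikit-learn's built-in KPCA preimage approximation in place of the dissertation's base algorithm; the bag count, kernel parameters, and function name are illustrative assumptions.

```python
# Hedged sketch: bagged kernel PCA preimages to attenuate noise in subspace estimation.
import numpy as np
from sklearn.decomposition import KernelPCA

def bagged_kpca_preimage(X, n_bags=25, n_components=4, gamma=0.1, seed=None):
    """Average denoised preimages of X over KPCA models fit on bootstrap samples of X."""
    rng = np.random.default_rng(seed)
    preimages = np.zeros_like(X, dtype=float)
    for _ in range(n_bags):
        idx = rng.integers(0, len(X), size=len(X))       # bootstrap sample of the rows
        kpca = KernelPCA(n_components=n_components, kernel="rbf",
                         gamma=gamma, fit_inverse_transform=True)
        kpca.fit(X[idx])
        preimages += kpca.inverse_transform(kpca.transform(X))
    return preimages / n_bags
```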
Contributors: Sahu, Anshuman (Author) / Runger, George C. (Thesis advisor) / Wu, Teresa (Committee member) / Pan, Rong (Committee member) / Maciejewski, Ross (Committee member) / Arizona State University (Publisher)
Created: 2013
Description
Conservation planning is fundamental to guarantee the survival of endangered species and to preserve the ecological values of some ecosystems. Planning land acquisitions increasingly requires a landscape approach to mitigate the negative impacts of spatial threats such as urbanization, agricultural development, and climate change. In this context, landscape connectivity and compactness are vital characteristics for the effective functionality of conservation reserves. Connectivity allows species to travel across landscapes, facilitating the flow of genes across populations from different protected areas. Compactness measures the spatial dispersion of protected sites, which can be used to mitigate risk factors associated with species leaving and re-entering the reserve. This research proposes an optimization model to identify areas to protect while enforcing connectivity and compactness. For the suggested protected area, this research builds upon existing methods and develops an alternative metric of compactness that penalizes the selection of patches of land with few protected neighbors. The new metric is referred to as leaf because it is intended to minimize the number of selected areas with only one neighboring protected area. The model includes budget and minimum selected area constraints to reflect realistic financial and ecological requirements. Using a lexicographic approach, the model can improve the compactness of conservation reserves obtained by other methods. The use of the model is illustrated by solving instances of up to 1,100 patches.
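A rough sense of a reserve-selection model with a leaf-style compactness penalty can be conveyed by the following hedged sketch written with PuLP. It collapses the lexicographic approach into a single weighted objective, uses a weak "no isolated selected patch" constraint as a stand-in for full connectivity, and the penalty linearization, data structures, and names are illustrative assumptions rather than the dissertation's formulation.

```python
# Hedged sketch: budget-constrained patch selection penalizing "leaf" patches
# (selected patches with exactly one selected neighbor).
from pulp import LpProblem, LpMaximize, LpVariable, LpBinary, lpSum

def build_reserve_model(value, cost, neighbors, budget, leaf_penalty=1.0):
    """value/cost: {patch: number}; neighbors: {patch: list of adjacent patches}."""
    patches = list(value)
    x = LpVariable.dicts("select", patches, cat=LpBinary)
    leaf = LpVariable.dicts("leaf", patches, cat=LpBinary)
    m = LpProblem("reserve_design", LpMaximize)
    # Objective: conservation value minus a penalty on leaf patches.
    m += lpSum(value[i] * x[i] for i in patches) - leaf_penalty * lpSum(leaf[i] for i in patches)
    m += lpSum(cost[i] * x[i] for i in patches) <= budget
    for i in patches:
        s_i = lpSum(x[j] for j in neighbors[i])   # number of selected neighbors of i
        m += x[i] <= s_i                          # no isolated selected patch
        m += leaf[i] >= 2 * x[i] - s_i            # leaf[i] forced to 1 when exactly one neighbor is selected
    return m, x
```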
Contributors: Ravishankar, Shreyas (Author) / Sefair, Jorge A. (Thesis advisor) / Askin, Ronald (Committee member) / Maciejewski, Ross (Committee member) / Arizona State University (Publisher)
Created: 2019
Description
Testing and Verification of Cyber-Physical Systems (CPS) is a challenging problem. The challenge arises as a result of the complex interactions between the components of these systems: the digital control and the physical environment. Furthermore, the software complexity that governs the high-level control logic in these systems is increasing day by day. As a result, in recent years, both the academic community and industry have invested heavily in developing tools and methodologies for the development of safety-critical systems. One scalable approach to testing and verifying these systems is guided system simulation using stochastic optimization techniques. The goal of the stochastic optimizer is to find system behavior that does not meet the intended specifications.

In this dissertation, three methods that facilitate the testing and verification process for CPS are presented:

1. A graphical formalism and tool that enable the elicitation of formal requirements. To evaluate the performance of the tool, a usability study is conducted.

2. A parameter mining method to infer, analyze, and visually represent falsifying ranges for parametrized system specifications.

3. A notion of conformance between a CPS model and implementation along with a testing framework.

The methods are evaluated over high-fidelity case studies from industry.
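To illustrate the guided-simulation idea behind these methods, here is a minimal sketch of a stochastic-optimization falsification loop (simulated annealing over system inputs, minimizing a robustness value). The simulator, robustness metric, and input sampler are placeholder callables and the parameter choices are illustrative assumptions, not the tools developed in this dissertation.

```python
# Hedged sketch: searching for specification-violating behavior via simulated annealing.
import math
import random

def falsify(simulate, robustness, sample_input, n_iters=1000, temp=1.0, cooling=0.995):
    """Search for an input whose simulated trace violates the specification (robustness < 0).

    simulate(u)             -> system trace for input u (placeholder)
    robustness(trace)       -> signed satisfaction value; negative means violation (placeholder)
    sample_input(near=None) -> random input, or a local perturbation of `near` (placeholder)
    """
    best_u = current_u = sample_input()
    best_r = current_r = robustness(simulate(current_u))
    for _ in range(n_iters):
        candidate = sample_input(near=current_u)
        r = robustness(simulate(candidate))
        if r < best_r:
            best_u, best_r = candidate, r
        # Accept worse candidates with a probability that shrinks as the temperature cools.
        if r < current_r or random.random() < math.exp((current_r - r) / max(temp, 1e-9)):
            current_u, current_r = candidate, r
        temp *= cooling
        if best_r < 0:                     # a falsifying behavior has been found
            break
    return best_u, best_r
```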
Contributors: Hoxha, Bardh (Author) / Fainekos, Georgios (Thesis advisor) / Sarjoughian, Hessam S. (Committee member) / Maciejewski, Ross (Committee member) / Ben Amor, Heni (Committee member) / Arizona State University (Publisher)
Created: 2017
Description
Short-notice disasters such as hurricanes involve uncertainties in many facets, from the time of occurrence to the magnitude of the impacts. Failure to incorporate these uncertainties can affect the effectiveness of emergency responses. In the case of a hurricane event, uncertainties and their corresponding impacts can quickly cascade during the storm. Over the past decades, various storm forecast models have been developed to predict storm uncertainties; however, access to these models is limited. Hence, as the first part of this research, a data-driven simulation model is developed with the aim of generating spatial-temporal predicted storm hazards for each possible hurricane track modeled. The simulation model identifies a means to represent uncertainty in the storm's movement and its associated potential hazards in the form of a probabilistic scenario tree, where each branch is associated with a scenario-level storm track and weather profile. Storm hazards, such as strong winds, torrential rain, and storm surges, can inflict significant damage on the road network and affect the population's ability to move during the storm event. A cascading network failure algorithm is introduced in the second part of the research. The algorithm takes the scenario-level storm hazards to predict uncertainties in mobility states over the storm event. In the third part of the research, a methodology is proposed to generate a sequence of actions that simultaneously solves the evacuation flow scheduling and suggests routes that minimize the total flow time, or makespan, of the evacuation process from origins to destinations in the resulting stochastic time-dependent network. The methodology is implemented for the 2017 Hurricane Irma case study to recommend an evacuation policy for Manatee County, FL. The results are compared with evacuation plans for assumed scenarios; the research suggests that evacuation recommendations based on single scenarios reduce the effectiveness of the evacuation procedure. The overall contributions of the research presented here are new methodologies to: (1) predict and visualize the spatial-temporal impacts of an oncoming storm event, (2) predict uncertainties in the impacts to transportation infrastructure and mobility, and (3) determine the quickest evacuation schedule and routes under the uncertainties within the resulting stochastic transportation networks.
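As a rough illustration of makespan-style evacuation analysis on a time-dependent network, the following sketch builds a time-expanded graph and searches for the smallest horizon at which a maximum flow can route all evacuees to shelters (using networkx). This is a textbook quickest-flow construction under illustrative data and names, not the scenario-based scheduling methodology of this research.

```python
# Hedged sketch: evacuation makespan estimation via max-flow on a time-expanded network.
import networkx as nx

def time_expanded_graph(arcs, sources, sinks, horizon):
    """arcs: (u, v, travel_time, capacity) road links; sources: {origin: evacuees}; sinks: shelter nodes."""
    G = nx.DiGraph()
    nodes = {u for u, _, _, _ in arcs} | {v for _, v, _, _ in arcs}
    for t in range(horizon + 1):
        for u, v, tau, cap in arcs:
            if t + tau <= horizon:
                G.add_edge((u, t), (v, t + tau), capacity=cap)   # travel arc
        if t < horizon:
            for n in nodes:
                G.add_edge((n, t), (n, t + 1))                   # waiting arc (uncapacitated)
    for s, demand in sources.items():
        G.add_edge("super_source", (s, 0), capacity=demand)
    for k in sinks:
        for t in range(horizon + 1):
            G.add_edge((k, t), "super_sink")                     # collect arrivals at any time
    return G

def evacuation_makespan(arcs, sources, sinks, max_horizon):
    """Smallest horizon T at which all evacuees can reach a shelter, or None if beyond max_horizon."""
    total_demand = sum(sources.values())
    for T in range(1, max_horizon + 1):
        G = time_expanded_graph(arcs, sources, sinks, T)
        if nx.maximum_flow_value(G, "super_source", "super_sink") >= total_demand:
            return T
    return None
```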
Contributors: Gita, Ketut (Author) / Mirchandani, Pitu (Thesis advisor) / Maciejewski, Ross (Committee member) / Sefair, Jorge (Committee member) / Zhou, Xuesong (Committee member) / Arizona State University (Publisher)
Created: 2020