Description
Virtual Reality (hereafter VR) and Mixed Reality (hereafter MR) have opened a new line of applications and possibilities. Amidst a vast network of potential applications, little research has been done to provide real-time collaboration capability between users of VR and MR. The idea of this thesis study is to develop and test a real-time collaboration system between VR and MR. The system works much like a Google document, where two or more users can see what others are doing, i.e., writing, modifying, or viewing. Similarly, the system developed during this study enables users in VR and MR to collaborate in real time.

The study of developing a real-time cross-platform collaboration system between VR and MR takes into consideration a scenario in which multiple device users are connected to a multiplayer network where they are guided to perform various tasks concurrently.
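A minimal sketch of the shared-state synchronization such a multiplayer session implies (the message format, field names, and last-writer-wins rule are assumptions; the abstract does not specify the networking layer):

```python
import json
import time

# Hypothetical shared-state sync: each client broadcasts the transforms of
# objects it manipulates; peers apply updates with a last-writer-wins rule
# keyed on a timestamp. Names and message format are illustrative only.

class SharedScene:
    def __init__(self):
        self.objects = {}  # object_id -> {"transform": [...], "stamp": float}

    def local_update(self, object_id, transform):
        """Record a local manipulation and return the message to broadcast."""
        msg = {"id": object_id, "transform": transform, "stamp": time.time()}
        self.apply(msg)
        return json.dumps(msg)

    def apply(self, msg):
        """Apply a local or remote update; newer timestamps win."""
        if isinstance(msg, str):
            msg = json.loads(msg)
        current = self.objects.get(msg["id"])
        if current is None or msg["stamp"] > current["stamp"]:
            self.objects[msg["id"]] = {"transform": msg["transform"],
                                       "stamp": msg["stamp"]}

# Two peers (e.g., one VR and one MR headset) converging on the same state:
vr, mr = SharedScene(), SharedScene()
wire = vr.local_update("chair_leg_1", [0.0, 0.5, 1.0])  # VR user moves a part
mr.apply(wire)                                           # MR user sees the move
assert mr.objects == vr.objects
```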

Usability testing was conducted to evaluate participant perceptions of the system. Users were asked to assemble a chair in alternating turns; afterward, they completed a survey and gave an audio interview. Results collected from the participants showed positive feedback toward using VR and MR for collaboration. However, several limitations of the current generation of devices hinder mass adoption; devices with better performance characteristics should lead to wider adoption.
ContributorsSeth, Nayan Sateesh (Author) / Nelson, Brian (Thesis advisor) / Walker, Erin (Committee member) / Atkinson, Robert (Committee member) / Arizona State University (Publisher)
Created2017
Description
Alzheimer’s disease (AD) is a chronic neurodegenerative disease that usually starts slowly and worsens over time. It is the cause of 60% to 70% of cases of dementia. There is growing interest in identifying brain image biomarkers that help evaluate AD risk pre-symptomatically. High-dimensional non-linear pattern classification methods have been applied to structural magnetic resonance images (MRIs) and used to discriminate between clinical groups in Alzheimer’s progression. Using fluorodeoxyglucose (FDG) positron emission tomography (PET) as the preferred imaging modality, this thesis develops two independent machine learning based patch analysis methods and uses them to perform six binary classification experiments across different AD diagnostic categories. Specifically, features were extracted and learned using dimensionality reduction and dictionary learning & sparse coding, taking overlapping patches in and around the cerebral cortex as features. Using AdaBoost as the preferred classifier, both methods try to utilize 18F-FDG PET as a biological marker in the early diagnosis of Alzheimer’s disease. Additionally, we investigate the involvement of rich demographic features (ApoE3, ApoE4, and Functional Activities Questionnaire (FAQ) scores) in classification. The experimental results on the Alzheimer’s Disease Neuroimaging Initiative (ADNI) dataset demonstrate the effectiveness of both proposed systems. The use of 18F-FDG PET may offer a new sensitive biomarker and enrich the brain imaging analysis toolset for studying the diagnosis and prognosis of AD.
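A hedged sketch of the kind of pipeline described, using synthetic data in place of FDG-PET volumes; patch size, stride, and component counts are placeholders rather than the thesis's settings:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import cross_val_score

# Illustrative pipeline in the spirit of the thesis: overlapping patches are
# taken from (here, synthetic) image slices, reduced with PCA, and fed to
# AdaBoost for a binary diagnostic classification.

def extract_patches(volume, size=8, stride=4):
    """Flatten overlapping 2D patches from one slice into one feature vector."""
    patches = []
    for i in range(0, volume.shape[0] - size + 1, stride):
        for j in range(0, volume.shape[1] - size + 1, stride):
            patches.append(volume[i:i + size, j:j + size].ravel())
    return np.concatenate(patches)  # one long feature vector per subject

rng = np.random.default_rng(0)
X = np.stack([extract_patches(rng.random((32, 32))) for _ in range(60)])
y = rng.integers(0, 2, size=60)  # e.g., AD vs. cognitively normal (synthetic)

features = PCA(n_components=20).fit_transform(X)  # dimensionality reduction
clf = AdaBoostClassifier(n_estimators=100)
print(cross_val_score(clf, features, y, cv=5).mean())
```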
ContributorsSrivastava, Anant (Author) / Wang, Yalin (Thesis advisor) / Bansal, Ajay (Thesis advisor) / Liang, Jianming (Committee member) / Arizona State University (Publisher)
Created2017
Description
In this era of high-tech computer advancements and tremendous programmable computing capability, construction cost estimation remains a knowledge-intensive, experience-driven task. Heavy reliance on human expertise and limited accuracy in decision-support tools render cost estimation error-prone. Arriving at accurate cost estimates is of paramount importance because they form the basis of most financial, design, and executive decisions concerning the project at subsequent stages. As its unique contribution to the body of knowledge, this paper analyzes the deviations and behavior of costs associated with different construction activities involved in commercial office tenant improvement (TI) projects. The aim of this study is to obtain useful micro-level cost information on the various construction activities that make up the total construction cost of projects. Standardization and classification of construction activities were carried out based on the Construction Specifications Institute’s (CSI) MasterFormat® division items. Construction costs from 51 office TI projects completed during 2015 and 2016 were analyzed statistically to understand trends among the construction activities involved. The interior finishes activities showed a much higher cost of construction, and comparatively higher variation, than the mechanical, electrical, and plumbing (MEP) trades. The statistical analysis also revealed substantial scope for energy-saving measures in such TI projects, since 66% of the projects lacked energy management systems (EMS).
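The micro-level summary described might look like the following sketch; the column names and cost figures are invented for illustration:

```python
import pandas as pd

# Hypothetical sketch of the analysis described: activity costs from TI
# projects, tagged with CSI MasterFormat divisions, summarized to compare
# central tendency and variation across trades. All values are invented.

costs = pd.DataFrame({
    "division": ["09 Finishes", "09 Finishes", "09 Finishes",
                 "23 HVAC", "23 HVAC", "26 Electrical", "26 Electrical"],
    "cost_per_sf": [38.0, 45.2, 51.0, 12.5, 14.1, 9.8, 11.2],
})

summary = (costs.groupby("division")["cost_per_sf"]
                .agg(["mean", "std", "count"]))
summary["cv"] = summary["std"] / summary["mean"]  # coefficient of variation
print(summary.sort_values("mean", ascending=False))
```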
ContributorsGhosh, Arunabho (Author) / Grau, David (Thesis advisor) / Ayer, Steven (Committee member) / Parrish, Kristen (Committee member) / Arizona State University (Publisher)
Created2016
Description
Identifying chemical compounds that inhibit bacterial infection has recently gained considerable attention given the increased number of highly resistant bacteria and the serious health threat they pose around the world. With the development of automated microscopy and image analysis systems, the process of identifying novel therapeutic drugs can generate an immense amount of data, easily reaching terabytes of information. Despite the vast amount of data now generated, traditional analytical methods have not increased the overall success rate of identifying active chemical compounds that eventually become novel therapeutic drugs. Moreover, multispectral imaging has become ubiquitous in drug discovery due to its ability to provide valuable information on cellular and sub-cellular processes using fluorescent reagents. These reagents are often costly and toxic to cells over extended periods, imposing limitations on experimental design. Thus, there is a significant need for a more efficient process of identifying active chemical compounds.

This dissertation introduces novel machine learning methods based on parallelized cellomics to analyze interactions between cells, bacteria, and chemical compounds while reducing the use of fluorescent reagents. Machine learning analysis using image-based high-content screening (HCS) data is compartmentalized into three primary components: (1) Image Analytics, (2) Phenotypic Analytics, and (3) Compound Analytics. A novel software analytics tool called the Insights project is also introduced; it fully incorporates distributed processing, high performance computing, and database management to rapidly and effectively utilize and store massive amounts of data generated by HCS biological assessments (bioassays). It is ideally suited for parallelized cellomics in high dimensional space.

Results demonstrate that a parallelized cellomics approach increases the quality of a bioassay while vastly decreasing the need for control data. The reduction in control data leads to less fluorescent reagent consumption. Furthermore, a novel proposed method that uses single-cell data points is shown to identify known active chemical compounds with a high degree of accuracy, even when traditional quality control measurements indicate the bioassay to be of poor quality. This ultimately decreases the time and resources needed to optimize bioassays while still accurately identifying active compounds.
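A minimal sketch of the parallelized, single-cell idea under stated assumptions; the feature columns, the scoring rule, and the 0.5 threshold are stand-ins, not the dissertation's method:

```python
import numpy as np
from concurrent.futures import ProcessPoolExecutor

# Hedged sketch: per-cell feature rows are scored independently (here by a
# trivial placeholder rule), in parallel across images, and aggregated into
# a per-compound activity fraction.

def score_image(cells):
    """Score one image's cells; returns the fraction flagged as responding."""
    intensities = cells[:, 0]           # placeholder: first feature column
    return float(np.mean(intensities > 0.5))

rng = np.random.default_rng(1)
images = [rng.random((200, 4)) for _ in range(8)]  # 8 images, 200 cells each

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:
        per_image = list(pool.map(score_image, images))
    print("compound activity estimate:", np.mean(per_image))
```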
ContributorsTrevino, Robert (Author) / Liu, Huan (Thesis advisor) / Lamkin, Thomas J (Committee member) / He, Jingrui (Committee member) / Lee, Joohyung (Committee member) / Arizona State University (Publisher)
Created2016
Description
Affect signals what humans care about and is involved in rational decision-making and action selection. Many technologies may be improved by the capability to recognize human affect and to respond adaptively by appropriately modifying their operation. This capability, named affect-driven self-adaptation, benefits systems as diverse as learning environments, healthcare applications, and video games, and indeed has the potential to improve systems that interact intimately with users across all sectors of society. The main challenge is that existing approaches to advancing affect-driven self-adaptive systems typically limit their applicability by supporting the creation of one-of-a-kind systems with hard-wired affect recognition and self-adaptation capabilities, which are brittle, costly to change, and difficult to reuse. A solution to this limitation is to leverage the development of affect-driven self-adaptive systems with a manufacturing vision.

This dissertation demonstrates how using a software product line paradigm can jumpstart the development of affect-driven self-adaptive systems with that manufacturing vision. Applying a software product line approach to the affect-driven self-adaptive domain provides a comprehensive, flexible and reusable infrastructure of components with mechanisms to monitor a user’s affect and his/her contextual interaction with a system, to detect opportunities for improvements, to select a course of action, and to effect changes. It also provides a domain-specific architecture and well-documented process guidelines, which facilitate an understanding of the organization of affect-driven self-adaptive systems and their implementation by systematically customizing the infrastructure to effectively address the particular requirements of specific systems.
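A hedged sketch of such a reusable monitor/detect/select/effect loop, with product-specific parts plugged in; class and method names are illustrative, not the dissertation's actual architecture:

```python
# Generic adaptation loop with the product-specific pieces supplied as
# pluggable components, in the spirit of a software product line.

class AffectMonitor:
    def sense(self):
        """Return the user's current affective/contextual state (stubbed)."""
        return {"frustration": 0.8, "task": "quiz"}

class AdaptationPolicy:
    def detect(self, state):
        return state["frustration"] > 0.6   # opportunity for improvement?

    def select(self, state):
        return "offer_hint"                 # chosen course of action

class Effector:
    def effect(self, action):
        print(f"adapting system: {action}")

def adaptation_loop(monitor, policy, effector):
    state = monitor.sense()
    if policy.detect(state):
        effector.effect(policy.select(state))

# A specific product (e.g., a learning environment) customizes the parts:
adaptation_loop(AffectMonitor(), AdaptationPolicy(), Effector())
```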

The software product line approach is evaluated by applying it in the development of learning environments and video games that demonstrate the significant potential of the solution, across diverse development scenarios and applications.

The key contributions of this work include extending self-adaptive system modeling, implementing a reusable infrastructure, and leveraging the use of patterns to exploit the commonalities between systems in the affect-driven self-adaptation domain.
ContributorsGonzalez-Sanchez, Javier (Author) / Burleson, Winslow (Thesis advisor) / Collofello, James (Thesis advisor) / Garlan, David (Committee member) / Sarjoughian, Hessam S. (Committee member) / Atkinson, Robert (Committee member) / Arizona State University (Publisher)
Created2016
Description
When software design teams attempt to collaborate on different design documents, they suffer from a serious collaboration problem. Designers collaborate either in person or remotely. In-person collaboration is expensive but effective; remote collaboration is inexpensive but inefficient. To gain the most benefit from collaboration, remote collaboration must be not only cheap but also as efficient as physical collaboration.

Remote collaboration on software design relies on general-purpose tools such as Word and Excel. These tools are then shared inefficiently via email, cloud-based file-locking tools, or services like Google Docs. Because these tools either increase the number of design building blocks or limit the times at which one can work on a specific document, they drastically decrease productivity.

This thesis outlines a new methodology to increase design productivity, accomplished by providing design-specific collaboration. Using version control systems, the methodology allows effective project collaboration between remotely located design teams. It encompasses role management, policy management, and design artifact management, including nonfunctional requirements. Version control can be used for different design products, improving communication and productivity among design teams. This thesis outlines the methodology and then presents a proof-of-concept tool that embodies the core of these principles.
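An illustrative sketch of the core idea, assuming an in-memory stand-in for a real version control backend such as Git; the roles, policies, and artifact names are hypothetical:

```python
# Design artifacts live in a version-controlled store, and role/policy
# management gates who may change which artifact.

class DesignRepo:
    def __init__(self, policies):
        self.policies = policies   # artifact -> set of roles allowed to edit
        self.history = {}          # artifact -> list of (author, content)

    def commit(self, artifact, author, role, content):
        if role not in self.policies.get(artifact, set()):
            raise PermissionError(f"{role} may not modify {artifact}")
        self.history.setdefault(artifact, []).append((author, content))

repo = DesignRepo({"requirements.md": {"analyst", "architect"},
                   "architecture.md": {"architect"}})
repo.commit("requirements.md", "alice", "analyst", "NFR: latency < 200 ms")
# repo.commit("architecture.md", "bob", "analyst", ...)  # would be rejected
print(repo.history)
```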
ContributorsPike, Shawn (Author) / Gaffar, Ashraf (Thesis advisor) / Lindquist, Timothy (Committee member) / Whitehouse, Richard (Committee member) / Arizona State University (Publisher)
Created2016
Description
This dissertation uses a comparative approach to investigate long-term human-environment interrelationships in times of climate change. It uses Geographical Information Systems and ecological models to reconstruct the Magdalenian (~20,000-14,000 calibrated years ago) environments of the coastal mountainous zone of Cantabria (Northwest Spain) and the interior valleys of the Dordogne (Southwest France) to contextualize the social networks that could have formed during a time of high climate and resource variability. It simulates the formation of such networks in an agent-based model, which documents the processes underlying the formation of archaeological assemblages and evaluates the potential impacts of climate-topography interactions on cultural transmission. This research then reconstructs the Magdalenian social networks visible through a multivariate statistical analysis of stylistic similarities among portable art objects. As these networks cannot be analyzed directly to infer social behavior, their characteristics are compared to the results of the agent-based model, which provides estimates of the characteristics of the latent Magdalenian social networks most likely to have produced the empirical archaeological assemblage studied.
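A toy sketch of the agent-based idea; the distance-decay tie formation and trait-copying rules are placeholders, not the dissertation's model:

```python
import math
import random

# Agents form social ties with a probability that decays with (here,
# straight-line) distance, standing in for the climate/topography-weighted
# travel costs used in the dissertation; a cultural trait then diffuses
# over the resulting network. All constants are placeholders.

random.seed(42)
agents = [{"pos": (random.random(), random.random()),
           "trait": i, "ties": []} for i in range(30)]

def travel_cost(a, b):
    """Stand-in for a topography-aware travel cost between two positions."""
    return math.dist(a, b)

for i, a in enumerate(agents):
    for j in range(i + 1, len(agents)):
        b = agents[j]
        if random.random() < 0.5 * math.exp(-5 * travel_cost(a["pos"], b["pos"])):
            a["ties"].append(j)
            b["ties"].append(i)

for _ in range(200):  # traits (e.g., art styles) spread along ties
    a = random.choice(agents)
    if a["ties"]:
        a["trait"] = agents[random.choice(a["ties"])]["trait"]

print("distinct traits remaining:", len({a["trait"] for a in agents}))
```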

This research contributes several new results, most of which point to the advantages of using an inter-disciplinary approach to the study of the archaeological record. It demonstrates the benefits of using an agent-based model to parse social data from long-term palimpsests. It shows that geographical and environmental contexts affect the structure of social networks, which in turn affects the transmission of ideas and goods that flow through them. This shows the presence of human-environment interactions that not only affected our ancestors’ reaction to resource insecurities, but also led them to innovate and improve the productivity of their own environment. However, it also suggests that such alterations may have reduced the populations’ resilience to strong climatic changes, and that the region with diverse resources provided a more stable and resilient environment than the region transformed to satisfy the immediate needs of its population.
ContributorsGravel-Miguel, Claudine (Author) / Barton, C. Michael (Thesis advisor) / Coudart, Anick (Committee member) / Clark, Geoffrey A. (Committee member) / Arizona State University (Publisher)
Created2017
Description
Feature learning and the discovery of nonlinear variation patterns in high-dimensional data are important tasks in many problem domains, such as imaging, streaming data from sensors, and manufacturing. This dissertation presents several methods for learning and visualizing nonlinear variation in high-dimensional data. First, an automated method for discovering nonlinear variation patterns using deep learning autoencoders is proposed. The approach provides a functional mapping from a low-dimensional representation to the original spatially-dense data that is both interpretable and efficient with respect to preserving information. Experimental results indicate that deep learning autoencoders outperform manifold learning and principal component analysis in reproducing the original data from the learned variation sources.

A key issue in using autoencoders for nonlinear variation pattern discovery is to encourage the learning of solutions where each feature represents a unique variation source, which we define as distinct features. This problem of learning distinct features is also referred to as disentangling factors of variation in the representation learning literature. The remainder of this dissertation highlights and provides solutions for this important problem.

An alternating autoencoder training method is presented and a new measure motivated by orthogonal loadings in linear models is proposed to quantify feature distinctness in the nonlinear models. Simulated point cloud data and handwritten digit images illustrate that standard training methods for autoencoders consistently mix the true variation sources in the learned low-dimensional representation, whereas the alternating method produces solutions with more distinct patterns.

Finally, a new regularization method for learning distinct nonlinear features using autoencoders is proposed. Motivated in part by the properties of linear solutions, a series of learning constraints are implemented via regularization penalties during stochastic gradient descent training. These include the orthogonality of tangent vectors to the manifold, the correlation between learned features, and the distributions of the learned features. This regularized learning approach yields low-dimensional representations which can be better interpreted and used to identify the true sources of variation impacting a high-dimensional feature space. Experimental results demonstrate the effectiveness of this method for nonlinear variation pattern discovery on both simulated and real data sets.
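As a hedged sketch of one such penalty, the following autoencoder adds a term penalizing off-diagonal feature covariance, nudging code units toward distinct variation sources; the architecture sizes and penalty weight are placeholders, not the dissertation's settings:

```python
import torch
import torch.nn as nn

# Autoencoder trained with a decorrelation regularizer on the learned code.

torch.manual_seed(0)
X = torch.randn(512, 20)  # stand-in for high-dimensional observations

enc = nn.Sequential(nn.Linear(20, 16), nn.Tanh(), nn.Linear(16, 2))
dec = nn.Sequential(nn.Linear(2, 16), nn.Tanh(), nn.Linear(16, 20))
opt = torch.optim.Adam(list(enc.parameters()) + list(dec.parameters()), lr=1e-3)

for step in range(200):
    z = enc(X)
    recon = dec(z)
    zc = z - z.mean(dim=0)                    # center the learned features
    cov = (zc.T @ zc) / (len(X) - 1)          # feature covariance matrix
    off_diag = cov - torch.diag(torch.diag(cov))
    loss = nn.functional.mse_loss(recon, X) + 0.1 * off_diag.pow(2).sum()
    opt.zero_grad()
    loss.backward()
    opt.step()

print("final loss:", loss.item())
```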
ContributorsHoward, Phillip (Author) / Runger, George C. (Thesis advisor) / Montgomery, Douglas C. (Committee member) / Mirchandani, Pitu (Committee member) / Apley, Daniel (Committee member) / Arizona State University (Publisher)
Created2016
Description
Bank institutions employ several marketing strategies to maximize new customer acquisition as well as current customer retention. Telemarketing is one such approach, in which individual customers are contacted by bank representatives with offers. These telemarketing strategies can be improved by combining them with data mining techniques that enable prediction of customer attributes and interests. In this thesis, bank telemarketing data from a Portuguese banking institution were analyzed to determine the predictability of several client demographic and financial attributes and to find the most contributing factors for each. Data were preprocessed to ensure quality, and data mining models were then generated for the attributes with logistic regression, support vector machine (SVM), and random forest, using Orange as the data mining tool. Results were analyzed using precision, recall, and F1 score.
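A minimal sketch of the model comparison described, run on synthetic stand-in data rather than the Portuguese bank dataset:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import precision_score, recall_score, f1_score
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Fit logistic regression, SVM, and random forest, then report precision,
# recall, and F1 for each, mirroring the evaluation in the thesis.

X, y = make_classification(n_samples=1000, n_features=16, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

models = {"logistic regression": LogisticRegression(max_iter=1000),
          "SVM": SVC(),
          "random forest": RandomForestClassifier(random_state=0)}

for name, model in models.items():
    pred = model.fit(X_tr, y_tr).predict(X_te)
    print(f"{name}: precision={precision_score(y_te, pred):.2f} "
          f"recall={recall_score(y_te, pred):.2f} "
          f"f1={f1_score(y_te, pred):.2f}")
```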
ContributorsEjaz, Samira (Author) / Davulcu, Hasan (Thesis advisor) / Balasooriya, Janaka (Committee member) / Candan, Kasim (Committee member) / Arizona State University (Publisher)
Created2016
Description
Despite incremental improvements over decades, academic planning solutions see relatively little use in many industrial domains, even where planning paradigms are clearly relevant to those problems. This work identifies four shortfalls of existing academic solutions that contribute to this lack of adoption.

To address these shortfalls this work defines model-independent semantics for planning and introduces an extensible planning library. This library is shown to produce feasible results on an existing benchmark domain, overcome the usual modeling limitations of traditional planners, and accommodate domain-dependent knowledge about the problem structure within the planning process.
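One way to read "model-independent semantics" is that the planner sees the domain only through a few opaque callables; the following sketch (with an invented toy domain) illustrates that separation, not the library's actual API:

```python
from collections import deque

# The planner knows nothing about the domain beyond three callables, so any
# model exposing them can be planned over.

def plan(initial, is_goal, successors):
    """Breadth-first forward search over an opaque state model."""
    frontier, seen = deque([(initial, [])]), {initial}
    while frontier:
        state, path = frontier.popleft()
        if is_goal(state):
            return path
        for action, nxt in successors(state):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append((nxt, path + [action]))
    return None

# Toy domain: move a token from position 0 to position 3 along a line.
print(plan(0,
           lambda s: s == 3,
           lambda s: [(f"step_to_{s + d}", s + d) for d in (-1, 1)
                      if 0 <= s + d <= 3]))
```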
ContributorsJonas, Michael (Author) / Gaffar, Ashraf (Thesis advisor) / Fainekos, Georgios (Committee member) / Doupe, Adam (Committee member) / Herley, Cormac (Committee member) / Arizona State University (Publisher)
Created2016