This collection includes both ASU Theses and Dissertations, submitted by graduate students, and the Barrett, The Honors College theses submitted by undergraduate students.

Displaying 1 - 10 of 52
Description

With the increasing user demand for low latency, elastic provisioning of computing resources coupled with ubiquitous and on-demand access to real-time data, cloud computing has emerged as a popular computing paradigm to meet growing user demands. However, with the introduction and rising use of wearable technology and the evolving uses of smartphones, the concept of the Internet of Things (IoT) has become a prevailing notion in the currently growing technology industry. Cisco Inc. has projected a data creation of approximately 403 Zettabytes (ZB) by 2018. Bringing these benign devices online and connecting them to the web has resulted in exploding service and data aggregation requirements, thus requiring a new and innovative computing platform. This platform should have the capability to provide robust real-time data analytics and resource provisioning to clients, such as IoT users, on demand. Such a computation model would need to function at the edge of the network, forming a bridge between the large cloud data centers and the distributed connected devices.

This research expands on the notion of bringing computational power to the edge of the network and then integrating it with the cloud computing paradigm whilst providing services to diverse IoT-based applications. This expansion is achieved through the establishment of a new computing model that serves as a platform for IoT-based devices to communicate with services in real time. We name this paradigm Gateway-Oriented Reconfigurable Ecosystem (GORE) computing. Finally, this thesis proposes and discusses the development of a policy management framework for accommodating our proposed computational paradigm. The policy framework is designed to serve both the hosted applications and the GORE paradigm by enabling them to function more efficiently. The goal of the framework is to ensure uninterrupted communication and service delivery between users and their applications.
Contributors: Dsouza, Clinton (Author) / Ahn, Gail-Joon (Thesis advisor) / Doupe, Adam (Committee member) / Dasgupta, Partha (Committee member) / Arizona State University (Publisher)
Created: 2015
Description

Android is currently the most widely used mobile operating system. The permission model in Android governs the resource access privileges of applications. The permission model, however, is susceptible to various attacks, including re-delegation attacks, background snooping attacks, and disclosure of private information. This thesis is aimed at understanding, analyzing, and performing forensics on application behavior. This research sheds light on several security aspects, including the use of inter-process communication (IPC) to perform permission re-delegation attacks.

The Android permission system is app-driven rather than user-controlled: applications specify their permission requirements, and the only thing a user can do is choose not to install a particular application based on those requirements. Given this all-or-nothing choice, users succumb to pressure and accept the requested permissions. This thesis proposes several ways of providing users finer-grained control over application privileges. The same methods can be used to mitigate the permission re-delegation attack.

This thesis also proposes and implements a novel methodology in Android that can be used to control the access privileges of an Android application, taking into consideration the context of the running application. This application-context-based permission usage is further used to analyze a set of sample applications. We found evidence of applications spoofing or divulging sensitive user information, such as location, contacts, and phone IDs and numbers, in the background. Such activities can be used to track users for a variety of privacy-intrusive purposes. We have developed implementations that minimize several forms of privacy leaks routinely committed by stock applications.
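As a rough illustration of such application-context-based permission control, the following Python sketch decides whether to serve a sensitive resource request depending on whether the requesting app is in the foreground. The `PermissionMonitor` class, permission names, and policy values are hypothetical assumptions, not details of the thesis implementation.

```python
# Hypothetical sketch of a context-aware permission check: a sensitive
# resource request is allowed only when the requesting app is in the
# foreground (i.e., the user is plausibly aware of the access).

SENSITIVE_PERMISSIONS = {"ACCESS_FINE_LOCATION", "READ_CONTACTS", "READ_PHONE_STATE"}

class PermissionMonitor:
    def __init__(self, user_policy):
        # user_policy maps (package, permission) -> "allow" | "deny" | "foreground-only"
        self.user_policy = user_policy

    def decide(self, package, permission, app_in_foreground):
        rule = self.user_policy.get((package, permission), "foreground-only")
        if rule == "allow":
            return True
        if rule == "deny":
            return False
        # Default: sensitive data may only be read while the app is visible.
        if permission in SENSITIVE_PERMISSIONS:
            return app_in_foreground
        return True

# Example: a background request for location by a flashlight app is rejected.
monitor = PermissionMonitor({("com.example.flashlight", "ACCESS_FINE_LOCATION"): "foreground-only"})
print(monitor.decide("com.example.flashlight", "ACCESS_FINE_LOCATION", app_in_foreground=False))  # False
```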
Contributors: Gollapudi, Narasimha Aditya (Author) / Dasgupta, Partha (Thesis advisor) / Xue, Guoliang (Committee member) / Doupe, Adam (Committee member) / Arizona State University (Publisher)
Created: 2014
Description

The increasing number of continually connected mobile persons has created an environment conducive to real-time user data gathering for many uses, both public and private in nature. Publicly, one can envision no longer requiring a census to determine the demographic composition of the country and its subregions. The information provided is vastly more up to date than that of a census and allows civil authorities to be more agile and preemptive with planning. Privately, advertisers take advantage of a person's stated opinions, demographics, and contextual (where and when) information in order to formulate and present pertinent offers.

Regardless of its use, this information can be sensitive in nature and should therefore be under the control of the user. Currently, a user has little say in how their information is processed once it has been released. An ad-hoc approach is currently in use, where location-based service providers each maintain their own policy over personal information usage.

In order to allow users more control over their personal information while still providing for targeted advertising, a systematic approach to the release of this information is needed. It is for that reason that we propose a User-Centric Context-Aware Spatiotemporal Anonymization framework. At its core, the framework unifies current spatiotemporal anonymization with traditional anonymization so that user-specified anonymization requirements are met or exceeded while allowing more demographic information to be released.
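A minimal sketch of the spatiotemporal side of this idea, assuming a simple grid-based generalization and a k-anonymity requirement; the grid sizes, k value, and function names are illustrative and not the framework's actual design.

```python
# Minimal sketch (not the thesis framework) of spatiotemporal generalization:
# coarsen a user's (lat, lon, hour) report until at least k reports fall in
# the same generalized cell, so the user is indistinguishable among k peers.

def generalize(report, level):
    lat, lon, hour = report
    cell = 0.01 * (2 ** level)          # spatial grid cell grows with each level
    window = 1 * (2 ** level)           # time window (hours) grows with each level
    return (round(lat / cell), round(lon / cell), hour // window)

def anonymize(report, all_reports, k=5, max_level=8):
    """Return the coarsest-necessary generalization level and cell meeting k-anonymity."""
    for level in range(max_level + 1):
        cell_id = generalize(report, level)
        peers = sum(1 for r in all_reports if generalize(r, level) == cell_id)
        if peers >= k:
            return level, cell_id
    return max_level, generalize(report, max_level)

reports = [(33.42 + i * 0.001, -111.93, 14) for i in range(10)]
print(anonymize((33.421, -111.93, 14), reports, k=5))
```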
Contributors: Sanchez, Michael Andrew (Author) / Ahn, Gail-Joon (Thesis advisor) / Doupe, Adam (Committee member) / Dasgupta, Partha (Committee member) / Arizona State University (Publisher)
Created: 2014
Description

The rate at which new malicious software (Malware) is created is consistently increasing each year. These new malware samples are designed to bypass the current anti-virus countermeasures employed to protect computer systems. Security Analysts must understand the nature and intent of a malware sample in order to protect computer systems from these attacks. The large number of new malware samples received daily by computer security companies requires Security Analysts to quickly determine the type, threat, and countermeasure for newly identified samples. Our approach provides a visualization tool that assists the Security Analyst in these tasks by allowing the Analyst to visually identify relationships between malware samples.

This approach consists of three steps. First, the received samples are processed by a sandbox environment to perform a dynamic behavior analysis. Second, the reports of the dynamic behavior analysis are parsed to extract identifying features, which are matched against other known and analyzed samples. Lastly, matches that are determined to express a relationship are visualized as an edge connecting a pair of nodes in an undirected graph.
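A small Python sketch of the second and third steps under simple assumptions: behavioral features are treated as sets, similarity is measured with the Jaccard index, and an edge is added when the similarity crosses a threshold. The feature strings and the 0.5 threshold are illustrative, not the thesis's actual matching rules.

```python
# Rough sketch of steps two and three: compare behavioral feature sets
# extracted from sandbox reports and connect samples whose similarity
# exceeds a threshold, yielding an undirected relationship graph.

def jaccard(a, b):
    return len(a & b) / len(a | b) if a | b else 0.0

samples = {
    "sample_A": {"creates_mutex:xyz", "registry:Run", "dns:evil.example"},
    "sample_B": {"creates_mutex:xyz", "registry:Run", "dns:other.example"},
    "sample_C": {"drops_file:temp.exe", "http:tracker.example"},
}

edges = []  # undirected graph stored as (node, node, weight) tuples
names = sorted(samples)
for i, u in enumerate(names):
    for v in names[i + 1:]:
        sim = jaccard(samples[u], samples[v])
        if sim >= 0.5:
            edges.append((u, v, round(sim, 2)))

print(edges)  # e.g. [('sample_A', 'sample_B', 0.5)]
```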
Contributors: Holmes, James Edward (Author) / Ahn, Gail-Joon (Thesis advisor) / Dasgupta, Partha (Committee member) / Doupe, Adam (Committee member) / Arizona State University (Publisher)
Created: 2014
Description

Due to the shortcomings of modern Mobile Device Management solutions, businesses have begun to incorporate forensics to analyze their mobile devices and respond to any incidents of malicious activity in order to protect their sensitive data. Current forensic tools, however, can only look at a static image of the device being examined, making it difficult for a forensic analyst to produce conclusive results regarding the integrity of any sensitive data on the device. This research thesis expands on the use of forensics to secure data by implementing an agent on a mobile device that can continually collect information regarding the state of the device. This information is then sent to a separate server in the form of log files to be analyzed using a specialized tool. The analysis tool is able to look at the data collected from the device over time and perform specific calculations, according to the user's specifications, highlighting any correlations or anomalies among the data which might be considered suspicious to a forensic analyst. The contribution of this thesis is both an in-depth explanation of the implementation of an iOS application used to improve the mobile forensics process and a proof-of-concept experiment showing how evidence collected over time can be used to improve the accuracy of a forensic analysis.
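The kind of over-time analysis described above can be illustrated with a small sketch, assuming the server aggregates the agent's log entries into per-window event counts; the log format, window size, and deviation threshold are assumptions rather than details from the thesis.

```python
# Illustrative sketch (not the thesis tool) of flagging anomalies in
# device-state logs collected over time: a window is suspicious when its
# event count deviates from the running mean by more than `threshold`
# standard deviations.

from statistics import mean, stdev

def flag_anomalies(counts_per_window, threshold=2.0):
    flagged = []
    for i, count in enumerate(counts_per_window):
        history = counts_per_window[:i]
        if len(history) < 3:
            continue  # need a baseline before judging a window
        mu, sigma = mean(history), stdev(history)
        if sigma > 0 and abs(count - mu) > threshold * sigma:
            flagged.append((i, count))
    return flagged

# e.g. hourly counts of keychain accesses reported by the on-device agent
print(flag_anomalies([4, 5, 4, 5, 5, 31, 5]))  # [(5, 31)]
```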
Contributors: Whitaker, Jeremy (Author) / Ahn, Gail-Joon (Thesis advisor) / Doupe, Adam (Committee member) / Yau, Stephen (Committee member) / Arizona State University (Publisher)
Created: 2015
Description

Widespread adoption of smartphone-based Mobile Medical Apps (MMAs) is opening new avenues for innovation, bringing MMAs to the forefront of low-cost healthcare delivery. These apps often control human physiology and work on sensitive data. Thus, it is necessary to have evidence of their trustworthiness, i.e., maintaining privacy of health data, long-term operation of wearable sensors, and ensuring no harm to the user, before actual marketing. Traditionally, clinical studies are used to validate the trustworthiness of medical systems. However, they can take a long time and could potentially harm the user. Such evidence can instead be generated using simulations and mathematical analysis. These methods involve estimating the MMA's interactions with human physiology. However, the nonlinear nature of human physiology makes the estimation challenging.

This research analyzes and develops MMA software while considering its interactions with human physiology to assure trustworthiness. A novel app development methodology is used to objectively evaluate the trustworthiness of an MMA by generating evidence using automatic techniques. It involves developing the Health-Dev β tool to generate a) evidence of the trustworthiness of MMAs and b) requirements-assured code generation for vulnerable components of the MMA without hindering the app development process. In this method, all requests from MMAs pass through a trustworthy entity, the Trustworthy Data Manager, which checks whether the app request satisfies the MMA requirements. This method is intended to expedite the design-to-market process of MMAs. The objective of this research is to develop models, tools, and theory for evidence generation, and the work can be divided into the following themes:

• Sustainable design configuration estimation of MMAs: Developing an optimization framework which can generate sustainable and safe sensor configuration while considering interactions of the MMA with the environment.

• Evidence generation using simulation and formal methods: Developing models and tools to verify safety properties of the MMA design to ensure no harm to the human physiology.

• Automatic code generation for MMAs: Investigating methods for automatically generating trustworthy software for vulnerable components of an MMA, along with evidence.

• Performance analysis of the Trustworthy Data Manager: Evaluating the response-time performance of the Trustworthy Data Manager under interactions from non-MMA smartphone apps.
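The request check performed by the Trustworthy Data Manager described above can be sketched roughly as follows; the class name, requirement fields (allowed data types, minimum sampling interval), and return values are hypothetical and only illustrate the gate-keeping idea.

```python
# Hypothetical sketch of the request-checking idea behind the Trustworthy
# Data Manager: an MMA data request is served only if it satisfies the
# requirements declared for that app (allowed data types, rate limits).

import time

class TrustworthyDataManager:
    def __init__(self, requirements):
        # requirements: app_id -> {"allowed": set of data types, "min_interval_s": float}
        self.requirements = requirements
        self.last_request = {}

    def handle_request(self, app_id, data_type):
        req = self.requirements.get(app_id)
        if req is None or data_type not in req["allowed"]:
            return None  # request violates the app's declared requirements
        now = time.monotonic()
        last = self.last_request.get((app_id, data_type), 0.0)
        if now - last < req["min_interval_s"]:
            return None  # sampling too fast could drain the wearable sensor
        self.last_request[(app_id, data_type)] = now
        return f"<{data_type} reading>"  # placeholder for the real sensor read

manager = TrustworthyDataManager({"ecg_monitor": {"allowed": {"ecg"}, "min_interval_s": 1.0}})
print(manager.handle_request("ecg_monitor", "ecg"))        # served
print(manager.handle_request("ecg_monitor", "location"))   # denied: not in requirements
```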
Contributors: Bagade, Priyanka (Author) / Gupta, Sandeep K. S. (Thesis advisor) / Wu, Carole-Jean (Committee member) / Doupe, Adam (Committee member) / Zhang, Yi (Committee member) / Arizona State University (Publisher)
Created: 2015
Description

Hardware-Assisted Security (HAS) is an emerging technology that addresses the shortcomings of software-based virtualized environments. There are two major weaknesses of software-based virtualization that HAS attempts to address: performance overhead and security issues. The performance overhead of software-based virtualization is due to the use of an additional software layer (i.e., the hypervisor). Since performance is highly related to the efficiency of processing data and providing services, reducing this overhead is one of the major concerns in data centers and enterprise networks. Software-based virtualization also imposes additional security issues in virtualized environments. To resolve those issues, HAS offloads security functions from the application layer to dedicated hardware, thereby achieving almost bare-metal performance and enhanced security. As a result, HAS has gained popularity, and the number of studies regarding the efficiency of the technology is increasing.

However, to our knowledge, no existing work provides a generic test mechanism that is universally applicable to all HAS devices. Preparing a testbed for each specific HAS device is a time-consuming and costly task for hardware manufacturers and network administrators. Therefore, we address the demand from hardware vendors and researchers for a generic testbed that can evaluate both the performance and the security functions of HAS-enabled systems.

In this thesis, the HAS device evaluation framework (HEF) is defined for hardware vendors, network administrators, and researchers to measure the performance of a system with HAS devices. HEF provides a generic test environment for a given HAS device by providing generic test metrics and evaluation mechanisms. HEF is also designed to take user-defined test metrics and test cases to support various hardware. The framework performs the entire process in an automated fashion and thus requires no user intervention. Finally, the efficacy of HEF is demonstrated by performing a case study using the Intel QuickAssist Technology (QAT) adapter, a dedicated PCI Express device for cryptographic tasks.
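A toy sketch of the automated, user-defined-test-case idea, not HEF's actual interface: the harness takes named test cases as callables, runs each repeatedly against the device under test, and records a latency metric without user intervention. All names and the stand-in workload are assumptions.

```python
# Minimal sketch of an automated test harness: user-defined test cases are
# executed without intervention and a wall-clock latency metric is recorded
# for each. A real case would drive the HAS device (e.g., via its driver API).

import time

def run_test_suite(test_cases, repetitions=3):
    """test_cases: dict mapping test name -> zero-argument callable exercising the device."""
    results = {}
    for name, case in test_cases.items():
        latencies = []
        for _ in range(repetitions):
            start = time.perf_counter()
            case()
            latencies.append(time.perf_counter() - start)
        results[name] = sum(latencies) / len(latencies)  # average latency in seconds
    return results

# Stand-in workload used only to make the sketch runnable.
suite = {"aes_encrypt_4KB": lambda: bytes(4096).hex()}
print(run_test_suite(suite))
```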
Contributors: Kyung, Sukwha (Author) / Ahn, Gail-Joon (Thesis advisor) / Doupe, Adam (Committee member) / Zhao, Ziming (Committee member) / Arizona State University (Publisher)
Created: 2017
Description

A Virtual Private Network (VPN) is the traditional approach for an end-to-end secure connection between two endpoints. Most existing VPN solutions are intended for wired networks with reliable connections. In a mobile environment, network connections are less reliable, and devices experience intermittent network disconnections due to either switching from one network to another or a gap in coverage during roaming. These disruptive events affect traditional VPN performance, resulting in possible termination of applications, data loss, and reduced productivity. Mobile VPNs bridge the gap between what users and applications expect from a wired network and the realities of mobile computing.

In this dissertation, MobiVPN was designed and developed by modifying the widely used OpenVPN so that the requirements of a mobile VPN are met. The aim of MobiVPN is to be a reliable and efficient VPN for mobile environments. To achieve these objectives, MobiVPN introduces the following features: 1) Fast and lightweight VPN session resumption, where MobiVPN decreases the time it takes to resume a VPN tunnel after a mobility event by an average of 97.19% compared to OpenVPN. 2) Persistence of the TCP sessions of tunneled applications, allowing them to survive VPN tunnel disruptions caused by a gap in network coverage, no matter how long the coverage gap is. MobiVPN also has mechanisms to suspend and resume TCP flows during and after a network disconnection, with a packet-buffering option to maintain the TCP sending rate. MobiVPN provides fast resumption of TCP flows after reconnection with improved TCP performance when multiple disconnections occur, with an average 30.08% increase in throughput in the experiments where buffering was used and an average 20.93% increase in throughput for flows that were not buffered. 3) Fine-grained, flow-based adaptive compression, which allows MobiVPN to treat each tunneled flow independently so that compression can be turned on for compressible flows and off for incompressible ones. The experiments showed that the flow-based adaptive compression outperformed OpenVPN's compression options in terms of effective throughput, data reduction, and number of compression operations.
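The flow-based adaptive compression idea can be sketched as follows, assuming each flow's first few packets are sampled to estimate compressibility; the sampling size, ratio threshold, and class name are illustrative assumptions rather than MobiVPN's actual parameters.

```python
# Illustrative sketch of per-flow adaptive compression: sample the first
# packets of each flow and keep compression enabled only for flows whose
# data actually shrinks under zlib.

import zlib

class FlowCompressor:
    def __init__(self, sample_packets=4, min_ratio=0.9):
        self.sample_packets = sample_packets
        self.min_ratio = min_ratio   # compressed/original size must stay below this
        self.state = {}              # flow_id -> sampling counters and decision

    def process(self, flow_id, payload):
        """Return (data_to_send, is_compressed) for one tunneled packet."""
        st = self.state.setdefault(flow_id, {"seen": 0, "comp": 0, "orig": 0, "enabled": True})
        if st["seen"] < self.sample_packets:
            compressed = zlib.compress(payload)
            st["seen"] += 1
            st["comp"] += len(compressed)
            st["orig"] += len(payload)
            if st["seen"] == self.sample_packets:
                # Turn compression off for this flow if sampling shows little gain.
                st["enabled"] = st["comp"] / st["orig"] < self.min_ratio
            return compressed, True
        if st["enabled"]:
            return zlib.compress(payload), True
        return payload, False  # incompressible flow: send as-is

fc = FlowCompressor()
packet = b"GET /index.html HTTP/1.1\r\nHost: example.com\r\n\r\n" * 4
data, compressed = fc.process("flow-1", packet)
print(compressed, len(data), "<", len(packet))  # compressible flow shrinks
```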
Contributors: Alshalan, Abdullah O. (Author) / Huang, Dijiang (Thesis advisor) / Ahn, Gail-Joon (Committee member) / Doupe, Adam (Committee member) / Zhang, Yanchao (Committee member) / Arizona State University (Publisher)
Created: 2017
Description

Data breaches have been on the rise, and the financial sector is among the most targeted. It can take a few months to a few years to identify the occurrence of a data breach. A major motivation behind data breaches is financial gain, hence much of the stolen data ends up for sale on darkweb websites. It is important to identify the sale of such stolen information in a timely and relevant manner. In this research, we present a system for timely identification of the sale of stolen data on darkweb websites. We frame identifying the sale of stolen data as a multi-label classification problem and leverage several machine learning approaches based on the thread content (textual features) and social network analysis of the user communication seen on darkweb websites. The system generates alerts about trends based on popularity amongst the users of such websites. We evaluate our system using K-fold cross-validation as well as manual evaluation of blind (unseen) data. The method combining social network and textual features outperforms the baseline method, i.e., using only textual features, by 15 to 20% improved precision. The alerts provide good insight, and we illustrate our findings through case studies of the results.
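A compact sketch of this feature combination under stated assumptions: made-up thread snippets, two invented social-network features per poster (e.g., centrality and degree), and scikit-learn's off-the-shelf multi-label setup with K-fold splitting. None of the data or hyperparameters come from the thesis.

```python
# Sketch of combining textual features with social-network features for
# multi-label classification of darkweb threads, evaluated with K-fold splits.

import numpy as np
from scipy.sparse import hstack, csr_matrix
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import KFold
from sklearn.multiclass import OneVsRestClassifier
from sklearn.preprocessing import MultiLabelBinarizer

threads = ["fresh cc dumps with cvv", "fullz ssn dob pack", "botnet rental cheap",
           "cc cvv fullz combo", "selling cvv and dumps bulk", "keylogger source and fullz"]
net_feats = np.array([[0.8, 12], [0.6, 5], [0.1, 3], [0.7, 9], [0.9, 15], [0.2, 4]])  # centrality, degree
labels = [["credit-card"], ["identity"], ["malware"],
          ["credit-card", "identity"], ["credit-card"], ["malware", "identity"]]

mlb = MultiLabelBinarizer()
Y = mlb.fit_transform(labels)
X_text = TfidfVectorizer().fit_transform(threads)
X = hstack([X_text, csr_matrix(net_feats)], format="csr")  # text + network features

clf = OneVsRestClassifier(LogisticRegression(max_iter=1000))
for train, test in KFold(n_splits=2, shuffle=True, random_state=0).split(threads):
    clf.fit(X[train], Y[train])
    print(mlb.inverse_transform(clf.predict(X[test])))
```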
Contributors: Dharaiya, Krishna Tushar (Author) / Shakarian, Paulo (Thesis advisor) / Doupe, Adam (Committee member) / Shoshitaishvili, Yan (Committee member) / Arizona State University (Publisher)
Created: 2018
Description

In this research, I try to solve a multi-class multi-label classification problem, where the goal is to automatically assign one or more labels (tags) to discussion topics seen on the deepweb. I observed a natural hierarchy in our dataset, and I used different techniques to ensure a hierarchical integrity constraint on the predicted tag list. To solve the `class imbalance' and `scarcity of labeled data' problems, I developed a semi-supervised model based on the elastic search (ES) document relevance score. I evaluate our models using the standard K-fold cross-validation method. Ensuring hierarchical integrity constraints improved the F1 score by 11.9% over standard supervised learning, while our ES-based semi-supervised learning model outperformed other models in terms of precision (78.4%) while maintaining a comparable recall (21%) score.
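One simple way to ensure the hierarchical integrity constraint on a predicted tag list is to close the prediction under the ancestor relation, as in the sketch below; the example tag hierarchy is invented for illustration and is not the dataset's actual taxonomy.

```python
# Small sketch of the hierarchical-integrity idea: whenever a tag is predicted,
# all of its ancestors in the tag hierarchy are added, so the final tag list is
# always consistent with the hierarchy.

PARENT = {                      # child tag -> parent tag (illustrative hierarchy)
    "credit-card-dumps": "financial-fraud",
    "fullz": "financial-fraud",
    "financial-fraud": "fraud",
    "exploit-kit": "malware",
}

def enforce_hierarchy(predicted_tags):
    closed = set(predicted_tags)
    for tag in predicted_tags:
        while tag in PARENT:            # walk up to the root, adding ancestors
            tag = PARENT[tag]
            closed.add(tag)
    return sorted(closed)

print(enforce_hierarchy({"credit-card-dumps", "exploit-kit"}))
# ['credit-card-dumps', 'exploit-kit', 'financial-fraud', 'fraud', 'malware']
```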
Contributors: Patil, Revanth (Author) / Shakarian, Paulo (Thesis advisor) / Doupe, Adam (Committee member) / Davulcu, Hasan (Committee member) / Arizona State University (Publisher)
Created: 2018