Matching Items (4)
Description
Corporations invest considerable resources to create, preserve, and analyze their data; yet while organizations are interested in protecting against unauthorized data transfer, no comprehensive metric exists to discriminate which data are at risk of leaking.

This thesis motivates the need for a quantitative leakage-risk metric and provides a risk assessment system, called Whispers, for computing it. Using unsupervised machine learning techniques, Whispers uncovers themes in an organization's document corpus, including previously unknown or unclassified data. Then, by correlating each document with its authors, Whispers can identify which data are easier to contain and, conversely, which are at risk. Using the Enron email database, Whispers constructs a social network segmented by topic themes. This graph uncovers communication channels within the organization. Using this social network, Whispers determines the risk of each topic by measuring the rate at which simulated leaks go undetected. For the Enron set, Whispers identified 18 separate topic themes between January 1999 and December 2000. The highest-risk data emanated from the legal department, with a leakage risk as high as 60%.
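As an illustration of the kind of pipeline described above, and not the thesis implementation, topic discovery and a simulated-leak risk estimate over a topic-segmented communication graph might be sketched as follows; the choice of scikit-learn's LDA and networkx, and every name and parameter below, are assumptions.

```python
# Illustrative sketch only: topic discovery over an email corpus and a
# topic-segmented sender->recipient graph, in the spirit of the approach
# described above. Library choices, names, and parameters are assumptions,
# not the thesis implementation.
import random

import networkx as nx
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer


def build_topic_graph(emails, n_topics=18):
    """emails: list of dicts with 'sender', 'recipients', and 'body' keys."""
    vectorizer = CountVectorizer(stop_words="english", max_features=5000)
    counts = vectorizer.fit_transform([e["body"] for e in emails])
    lda = LatentDirichletAllocation(n_components=n_topics, random_state=0)
    doc_topics = lda.fit_transform(counts)        # per-document topic mixture
    graph = nx.MultiDiGraph()
    for email, mixture in zip(emails, doc_topics):
        topic = int(mixture.argmax())             # dominant theme of the message
        for recipient in email["recipients"]:
            graph.add_edge(email["sender"], recipient, topic=topic)
    return graph


def leakage_risk(graph, topic, monitored_edges, trials=1000):
    """Estimate risk as the fraction of simulated leaks that evade monitoring."""
    edges = [(u, v) for u, v, d in graph.edges(data=True) if d["topic"] == topic]
    if not edges:
        return 0.0
    missed = sum(random.choice(edges) not in monitored_edges for _ in range(trials))
    return missed / trials
```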
Contributors: Wright, Jeremy (Author) / Syrotiuk, Violet (Thesis advisor) / Davulcu, Hasan (Committee member) / Yau, Stephen (Committee member) / Arizona State University (Publisher)
Created: 2014
Description
The recent flurry of security breaches has raised serious concerns about the security of data communication and storage. A promising way to enhance the security of a system is through a physical root of trust, for example through the use of physical unclonable functions (PUFs). A PUF leverages the inherent randomness in physical systems to provide device-specific authentication and encryption.

In this thesis, first the design of a highly reliable resistive random access memory (RRAM) PUF is presented. Compared to existing 1 cell/bit RRAM PUFs, here the sum of the read-out currents of multiple RRAM cells is used to generate one response bit. This method statistically minimizes early-lifetime failures due to RRAM retention degradation at high temperature or under voltage stress. Using a device model calibrated with IMEC HfOx RRAM experimental data, it was shown that an 8 cells/bit architecture achieves 99.9999% reliability for a lifetime of more than 10 years at 125℃. Also, the hardware area overhead of the proposed 8 cells/bit RRAM PUF architecture was smaller than that of a 1 cell/bit RRAM PUF that requires error correction coding to achieve the same reliability.
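For illustration only, a multi-cell response bit of the kind described above might be generated as in the sketch below; the current values and the comparison against a reference current are assumptions, not the calibrated HfOx device model.

```python
# Illustrative sketch: derive one PUF response bit from the summed read-out
# currents of several RRAM cells rather than from a single cell. The current
# values and the comparison against a reference are placeholder assumptions,
# not the calibrated device model used in the thesis.
import random


def response_bit(cell_currents_uA, reference_uA):
    """Return 1 if the summed cell current exceeds the reference, else 0."""
    return int(sum(cell_currents_uA) > reference_uA)


# Hypothetical usage: 8 cells per response bit, currents in microamps with
# device-to-device variation modeled as Gaussian noise.
cells = [random.gauss(10.0, 1.5) for _ in range(8)]
bit = response_bit(cells, reference_uA=80.0)
```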

Next, a basic security primitive is presented, where the RRAM PUF is embedded in the cryptographic module SHA-256. This architecture is referred to as Embedded PUF, or EPUF. EPUF has a security advantage over SHA-256 as it never exposes the PUF response to the outside world. Instead, in each round, the PUF response is used to change a few bits of the message word to produce a unique message digest for each IC. The use of EPUF as a key generation module for AES is also shown. The hardware area requirements for SHA-256 and AES-128 are then analyzed using synthesis results based on the TSMC 65nm library. It is shown that the area overhead of the 8 cells/bit RRAM PUF is only 1.08% of the SHA-256 module and 0.04% of the AES-128 module. A security analysis of the PUF-based systems is also presented. It is shown that the EPUF-based systems are resistant to standard attacks on PUFs, and that the security of the cryptographic modules is not compromised.
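The following rough sketch conveys only the high-level idea of mixing a device-specific PUF response into the hashed message; the actual EPUF construction modifies message words inside each SHA-256 round, which the standard hashlib API cannot express, so this stand-in and its names are assumptions.

```python
# Rough sketch of the high-level idea only: mix a device-specific PUF response
# into the hashed message so each IC produces a distinct digest. The actual
# EPUF design described above alters message words inside the SHA-256 rounds,
# which the standard hashlib API cannot express; this is a simplified stand-in.
import hashlib


def device_specific_digest(message: bytes, puf_response: bytes) -> bytes:
    """XOR the PUF response into the leading message bytes, then hash."""
    head = bytes(m ^ p for m, p in zip(message, puf_response))
    return hashlib.sha256(head + message[len(puf_response):]).digest()


def derive_aes128_key(puf_response: bytes) -> bytes:
    """Hypothetical key derivation: hash the PUF response, truncate to 128 bits."""
    return hashlib.sha256(puf_response).digest()[:16]
```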
Contributors: Shrivastava, Ayush (Author) / Chakrabarti, Chaitali (Thesis advisor) / Yu, Shimeng (Committee member) / Cao, Yu (Committee member) / Arizona State University (Publisher)
Created: 2015
Description
Data privacy is emerging as one of the most serious concerns of big data analytics, particularly with the growing use of personal data and the ever-improving capability of data analysis. This dissertation first investigates the relation between different privacy notions, and then puts the main focus on developing economic foundations for a market model of trading private data.

The first part characterizes differential privacy, identifiability, and mutual-information privacy by their privacy-distortion functions, each of which gives the optimal achievable privacy level as a function of the maximum allowable distortion. The results show that these notions are fundamentally related and exhibit a certain consistency: (1) the gap between the privacy-distortion functions of identifiability and differential privacy is upper bounded by a constant determined by the prior; (2) identifiability and mutual-information privacy share the same optimal mechanism; (3) the mutual-information optimal mechanism satisfies differential privacy with a level at most a constant away from the optimal level.
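As a rough formalization of that definition (the notation below is an assumption for illustration, not taken verbatim from the dissertation), the privacy-distortion function for a privacy-leakage measure $L$ and a distortion measure $d$ can be written as

$$ \epsilon^*(D) = \inf_{p_{Y\mid X} \,:\, \mathbb{E}[d(X,Y)] \le D} L\left(p_{Y\mid X}\right), $$

where $X$ is the original data, $Y$ is the released data, and $L$ is instantiated as the differential-privacy level, the identifiability level, or the mutual information $I(X;Y)$, depending on the privacy notion.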

The second part studies a market model of trading private data, where a data collector purchases private data from strategic data subjects (individuals) through an incentive mechanism. The value of epsilon units of privacy is measured by the minimum payment such that an individual's equilibrium strategy is to report data in an epsilon-differentially private manner. For the setting with binary private data that represents individuals' knowledge about a common underlying state, asymptotically tight lower and upper bounds on the value of privacy are established as the number of individuals becomes large, and the payment-accuracy tradeoff for learning the state is obtained. The lower bound establishes that epsilon units of privacy cannot be bought with any lower payment, and the upper bound is given by a designed reward mechanism. When the individuals' valuations of privacy are unknown to the data collector, mechanisms with possibly negative payments (aiming to penalize individuals with "unacceptably" high privacy valuations) are designed to fulfill the accuracy goal and drive the total payment to zero. For the setting with binary private data following a general joint probability distribution with some symmetry, asymptotically optimal mechanisms are designed in the high data quality regime.
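As a concrete, generic example of reporting a binary value in an epsilon-differentially private manner (classical randomized response, not the dissertation's reward mechanism):

```python
# Standard randomized response for a single private bit: report truthfully with
# probability e^eps / (1 + e^eps), otherwise flip. This classical mechanism is
# epsilon-differentially private; it is shown only as a concrete example of
# epsilon-DP reporting, not as the mechanism designed in the dissertation.
import math
import random


def randomized_response(bit: int, eps: float) -> int:
    p_truth = math.exp(eps) / (1.0 + math.exp(eps))
    return bit if random.random() < p_truth else 1 - bit
```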
Contributors: Wang, Weina (Author) / Ying, Lei (Thesis advisor) / Zhang, Junshan (Thesis advisor) / Scaglione, Anna (Committee member) / Zhang, Yanchao (Committee member) / Arizona State University (Publisher)
Created: 2016
Description
Access control has historically been recognized as an effective technique for ensuring that computer systems preserve important security properties. Recently, attribute-based access control (ABAC) has emerged as a new paradigm that provides access mediation by leveraging the concept of attributes: observable properties that become relevant under a certain security context and are exhibited by the entities normally involved in the mediation process, namely, end-users and protected resources. Also recently, independently run organizations from the private and public sectors have recognized the benefits of engaging in multi-disciplinary research collaborations that involve sharing sensitive proprietary resources such as scientific data, networking capabilities, and computation time, and have recognized ABAC as the paradigm that suits their needs for restricting the way such resources are shared with each other. In such a setting, a robust yet flexible access mediation scheme is crucial to guarantee that participants are granted access to such resources in a safe and secure manner.

However, no consensus exists in the literature with respect to a formal model that clearly defines the way the components depicted in ABAC should interact with each other, so that the rigorous study of security properties can be effectively pursued. This dissertation proposes an approach tailored to provide a well-defined and formal definition of ABAC, including a description of how attributes exhibited by different independent organizations are to be leveraged for mediating access to shared resources, by allowing collaborating parties to engage in federations for the specification, discovery, evaluation, and communication of attributes, policies, and access mediation decisions. In addition, a software assurance framework is introduced to support the correct construction of enforcement mechanisms implementing our approach by leveraging validation and verification techniques based on software assertions, namely, design by contract (DBC) and behavioral interface specification languages (BISL). Finally, this dissertation also proposes a distributed trust framework that allows for exchanging recommendations on the perceived reputations of members of our proposed federations, in such a way that the level of trust of previously unknown participants can be properly assessed for the purposes of access mediation.
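As a toy illustration of attribute-based access mediation in general, and not the formal model or federation protocol proposed in the dissertation, a rule-based evaluation over subject, resource, and context attributes might look like the following sketch; all attribute names are hypothetical.

```python
# Toy illustration of attribute-based access mediation: a policy is a set of
# rules over subject, resource, and context attributes, and access is granted
# when any rule is fully satisfied. This sketches the ABAC concept discussed
# above; it is not the formal model or federation protocol proposed in the
# dissertation, and all attribute names are hypothetical.
def evaluate(policy_rules, subject_attrs, resource_attrs, context_attrs):
    """Grant access if every attribute required by some rule is satisfied."""
    request = {**subject_attrs, **resource_attrs, **context_attrs}
    return any(all(request.get(k) == v for k, v in rule.items())
               for rule in policy_rules)


# Hypothetical policy: project members may read shared scientific data sets.
rules = [{"role": "project-member", "resource-type": "dataset", "action": "read"}]
allowed = evaluate(rules,
                   {"role": "project-member"},
                   {"resource-type": "dataset"},
                   {"action": "read"})
```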
Contributors: Rubio Medrano, Carlos Ernesto (Author) / Ahn, Gail-Joon (Thesis advisor) / Doupe, Adam (Committee member) / Zhao, Ziming (Committee member) / Santanam, Raghu (Committee member) / Huang, Dijiang (Committee member) / Arizona State University (Publisher)
Created: 2016